
Large increase in memory usage in v9.4.0 #690

Closed
voldern opened this issue Jan 3, 2019 · 18 comments
Labels
bug (Something does not work as it should) · external (The issue related to an external project) · regression (Something does not work anymore)

Comments


voldern commented Jan 3, 2019

Describe the bug

  • Node.js version: 10.12
  • OS & version: Docker node:10.12-slim

Problem

We started seeing a large increase in memory usage after updating to the latest version of got, in code that runs a few hundred requests a second. After testing every commit between v9.3.0 and v9.4.0, we were able to narrow it down to this one: https://github.com/sindresorhus/got/pull/659/files.

In our setup we run got configured with a timeout, retries, and HTTP keep-alive.

Code to reproduce

I'm working on isolating the case into one that is reproducible outside of our code base. It seems to start leaking when requests time out.

I'm creating this issue now in case someone is able to look at the commit above and understand why it causes memory usage to increase.
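
For context, a rough sketch of the configuration described above might look like this (the endpoint, agent settings, and timeout/retry values are placeholders, not the actual production settings):

```js
// Rough sketch only, not the production code: got v9 with a keep-alive
// agent, retries, and timeouts, as described above.
const got = require('got');
const Agent = require('agentkeepalive');

const agent = new Agent();

async function callUpstream() {
	return got('http://upstream.example/api', {
		agent,
		retry: 2,
		timeout: {
			socket: 1000,
			request: 5000
		}
	});
}
```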

Question

What is the purpose of setting request.setTimeout(0) after the request is canceled? Could it be that 0 in this case means "no timeout", so that we end up in a situation where there are a lot more connections open consuming memory?

Checklist

  • I have read the documentation.
  • I have tried my code with the latest version of Node.js and Got.
@szmarczak (Collaborator)

What is the purpose of setting request.setTimeout(0) after the request is canceled?

request.setTimeout(0) removes the timeout.
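
For illustration, here is the same behaviour outside of got, using Node's http module directly (the URL and delay are placeholders):

```js
const http = require('http');

// Illustration only: setTimeout(5000) arms an idle timeout on the underlying
// socket; setTimeout(0) disables it again (0 means "no timeout").
const req = http.get('http://localhost:8080/', res => res.resume());
req.setTimeout(5000, () => req.abort());
// ...later, e.g. once the response has been handled:
req.setTimeout(0);
```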

@szmarczak (Collaborator)

We started seeing a large increase in memory usage

Can you provide any data? How much?


voldern commented Jan 4, 2019

Can you provide any data? How much?

In v9.3.2 it stays around 130 MB; in v9.4.0 it keeps rising over time to 1 GB, and the process is killed because it consumes all of the available memory. The memory is only released a while after we stop sending traffic to the container, so something seems to hold on to it for some time.

After a while we also see the following messages, not sure if that is directly related to the issue:
(node:1) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 timeout listeners added. Use emitter.setMaxListeners() to increase limit

@sindresorhus (Owner)

Run Node.js with $ node --trace-warnings to get a stack trace of where the event emitter leak warning is coming from.


voldern commented Jan 4, 2019

I was able to create a concise example that reproduces the issue: https://github.com/voldern/got-memory-leak.

Check out the repo above, run docker stats and then run docker-compose up. You should see the memory for the "app" container keep rising until the memory limit is reached.

The issue is not reproducible without the agentkeepalive agent, or without timeout.socket set.

Run Node.js with $ node --trace-warnings to get a stack trace of where the event emitter leak warning is coming from.

app_1     | (node:1) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 timeout listeners added. Use emitter.setMaxListeners() to increase limit
app_1     |     at _addListener (events.js:243:17)
app_1     |     at Socket.addListener (events.js:259:10)
app_1     |     at Socket.Readable.on (_stream_readable.js:799:35)
app_1     |     at Socket.once (events.js:290:8)
app_1     |     at ClientRequest.req.on (_http_client.js:669:14)
app_1     |     at ClientRequest.emit (events.js:187:15)
app_1     |     at ClientRequest.origin.emit.args [as emit] (/usr/src/app/node_modules/@szmarczak/http-timer/source/index.js:36:11)
app_1     |     at tickOnSocket (_http_client.js:654:7)
app_1     |     at onSocketNT (_http_client.js:693:5)
app_1     |     at process._tickCallback (internal/process/next_tick.js:63:19)


timdp commented Jan 4, 2019

Same with v9.5.0.

szmarczak added the bug (Something does not work as it should) and regression (Something does not work anymore) labels and removed the bug label on Jan 4, 2019
@szmarczak (Collaborator)

Indeed, somehow cancelers.push(() => request.setTimeout(0)); causes a memory leak...


timdp commented Jan 5, 2019

  • Now that we know how to reproduce it, can we add a test? Tests for memory leaks aren't pleasant to write but I've found the weak package to be a great start.

  • Socket#setTimeout returns this so we're returning the socket itself from the canceler. Seems like bad style.

  • I think you just want to explicitly remove the timeout listener in the canceler. Removing the timeout setting itself will still keep the listener attached to the socket, so any implementation that reuses sockets is going to leak. (See the sketch after this list.)

  • I think I've mentioned this before but I wrote the event-registry package to deal with bulk-removing EventEmitter listeners. It's basically a formalization of the cancelers.
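
Following up on the point about removing the listener explicitly, a rough sketch of what that cleanup could look like (illustrative only; the names are made up and this is not got's actual timed-out.js code):

```js
// Illustrative sketch: attach the 'timeout' handler ourselves and have the
// canceler detach that exact listener, rather than only calling
// request.setTimeout(0) and leaving listeners behind on a reused socket.
const cancelers = [];

function addTimeout(request, delay, onTimeout) {
	request.setTimeout(delay);
	request.once('timeout', onTimeout);
	cancelers.push(() => {
		request.setTimeout(0);
		request.removeListener('timeout', onTimeout);
	});
}

function cancelTimeouts() {
	cancelers.forEach(cancel => cancel());
	cancelers.length = 0;
}
```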

rarkins added a commit to renovatebot/renovate that referenced this issue Jan 8, 2019
@szmarczak (Collaborator)

After removing request.setTimeout(...) and leaving cancelers.push(() => request.setTimeout(0)) I get:

(node:17772) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 timeout listeners added. Use emitter.setMaxListeners() to increase limit
    at _addListener (events.js:246:17)
    at Socket.addListener (events.js:262:10)
    at Socket.Readable.on (_stream_readable.js:826:35)
    at Socket.once (events.js:291:8)
    at listenSocketTimeout (_http_client.js:673:16)
    at ClientRequest.setTimeout (_http_client.js:736:3)
    at cancelers.push (C:\Users\Szymon Marczak\Desktop\Workspace\got\source\utils\timed-out.js:93:12)
    at cancelers.forEach.cancelTimeout (C:\Users\Szymon Marczak\Desktop\Workspace\got\source\utils\timed-out.js:75:38)
    at Array.forEach (<anonymous>)
    at IncomingMessage.cancelTimeouts (C:\Users\Szymon Marczak\Desktop\Workspace\got\source\utils\timed-out.js:75:13)


szmarczak commented Jan 10, 2019

szmarczak added the external (The issue related to an external project) label on Jan 10, 2019
timdp commented Jan 11, 2019

So this should be reported to nodejs/node?

@szmarczak (Collaborator)

@timdp Yup. Would you be interested in submitting an issue?

timdp commented Jan 11, 2019

Sure. I'll get on it later today.

timdp commented Jan 11, 2019

I'm not crazy about how you call makeRequest() recursively in your PoC, so I had a stab at writing my own version. Interestingly, it doesn't have the issue. Are you sure you're interpreting the results correctly?

But you were saying earlier that #694 doesn't fix it?

@szmarczak (Collaborator)

I had a stab at writing my own version. Interestingly, it doesn't have the issue.

I'll look at it.

Are you sure you're interpreting the results correctly?

Yeah, I'm sure. I'll try to rewrite @voldern's example to use only the native http module.

But you were saying earlier that #694 doesn't fix it?

Please quote? I don't recall saying anything like that.
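
For reference, a reproduction using only the native http module, as mentioned above, might look roughly like this (a hedged sketch; host, port, and timings are placeholders). In the affected Node versions, reusing one keep-alive socket while calling setTimeout() on each request is the pattern that accumulates 'timeout' listeners:

```js
// Hedged sketch of a bare-http reproduction: one keep-alive socket reused
// for many requests, each of which arms a socket timeout.
const http = require('http');

const agent = new http.Agent({keepAlive: true, maxSockets: 1});

function fire() {
	const request = http.get({host: 'localhost', port: 8080, agent}, response => {
		response.resume();
		response.on('end', fire); // the next request reuses the same socket
	});
	request.setTimeout(1000, () => request.abort());
}

fire();
```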

@szmarczak (Collaborator)

@timdp Updated the gist and added an explanation. I hope everything is clear now. There is no recursion anymore! :D

timdp commented Jan 14, 2019

Thanks for the clarification and the patch. 👍 I'll turn it into a bug report with Node today.

J9B10 commented May 16, 2024

I got here through a link in a Got file "timed-out.js".

I'm a beginner and just like a curious cat I found things in the archives.

About this notice:
// request.setTimeout(0) causes a memory leak.

Line 61: \got\dist\source\core\timed-out.js

I saw this warning about a memory leak.

I looked further into the use of this function and found a file that calls this problematic function when using HTTP/2.

Line 659: \got\dist\source\core\index.js

This file "index.js" uses the function flagged as problematic, "request.setTimeout(0)", on line 659.

So I don't understand whether the code is correct or someone forgot to fix this function.

My application has a random memory leak that appears after a few hours of calling an external API using Got with http2 enabled.

Could this have to do with the memory leak I get?

Got 14.2.1
Node 22.2.0.0

I'm just a beginner in programming.
I don't know if I'm posting this in the right or wrong place.
Sorry for my English as I'm using automatic translators.

I reported this in #2351.
