@00ff0000red this is a known bug in fetch (https://github.com/denoland/deno/issues/4735): if you consume the body in any way, the underlying connection will be closed and cleaned up.
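For anyone landing here, a minimal sketch of that workaround, consuming (or cancelling) the body so the connection can be released; the URL is a placeholder:

```js
// Consume the body so the underlying connection is closed and cleaned up.
const response = await fetch("https://example.com/"); // placeholder URL
await response.arrayBuffer(); // fully read the body
// Or, if the body isn't needed, cancel its stream instead:
// await response.body?.cancel();
```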
Ahh, that makes much more sense. It's not leaving files open like the first error,

```
error: Uncaught Error: Uncaught Http: error sending request for url (URL): error trying to connect: tcp connect error: Too many open files (os error 24)
```

instead, it's literally building up memory that isn't cleared.
I'll try consuming the body to see if that fixes it; if so, I'll close this in favor of the other issue.
Update: consuming the body seems to have had no effect whatsoever.
```js
const requests = new Array(400)
  .fill("https:// URL not disclosed for privacy reasons")
  // Note: a bare .map(fetch) would also pass the array index as fetch's
  // second argument; wrapping it in an arrow function avoids that.
  .map((url) => fetch(url));
const responses = await Promise.all(requests);
const buffers = responses.map((response) => response.arrayBuffer());
await Promise.all(buffers); // wait for all response bodies to be consumed
self.postMessage(null); // signal the main thread to terminate this worker
```
The program runs out of memory at the same point as it did before.
Update 2: I can run that on the main thread now, but it's still intermittently broken when run on the other thread; maybe the GC isn't getting to it in time?
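For anyone wanting to check the GC theory, here is a small sketch of how one might watch memory from inside the script. This assumes a newer Deno where `Deno.memoryUsage()` exists; `gc()` is only available when V8 exposes it via a flag:

```js
// Run with: deno run --allow-net --v8-flags=--expose-gc script.js
globalThis.gc?.(); // force a collection, if V8 exposes it
console.log(Deno.memoryUsage()); // { rss, heapTotal, heapUsed, external }
```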
@00ff0000red could you open a PR with the presented reproduction as a test? I will look into this problem some time in the future.
@bartlomieju I don't think it's possible to create an automated test for this: Deno died after accumulating too much memory, so there was no runtime exception that I could simply catch in JavaScript.
Also, I have since deleted or rewritten the code to avoid the bug, so... I'll see if I can replicate it myself again :)
Lastly, do you have any server that I could just spam with eight thousand HTTP requests?
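Rather than spamming someone else's server, a throwaway local one works for reproductions like this; `Deno.serve` is available in newer Deno versions. A sketch:

```js
// server.js: a throwaway local endpoint to hammer with requests.
Deno.serve({ port: 8000 }, () => new Response("ok"));
```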
Update: I had practically never updated Deno, and after moving to 1.5.4 the worker code doesn't even work; Promise.all on the buffers wasn't resolving. I tried this:
```js
// Log each buffer's index as its promise resolves, to see how far we get.
buffers.map(async (promise, i) => {
  await promise;
  console.log(i);
});
```
It quickly logs up to 201 and then stops entirely.
Lowering the iteration count from 400 to 200 worked.
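Since the failures start right around 200 in-flight connections, one way to stay under that ceiling is to batch the requests; a sketch (the batch size of 100 is an arbitrary assumption):

```js
// Fetch URLs in batches so only `batchSize` connections are open at once.
async function fetchInBatches(urls, batchSize = 100) {
  const bodies = [];
  for (let i = 0; i < urls.length; i += batchSize) {
    const responses = await Promise.all(
      urls.slice(i, i + batchSize).map((url) => fetch(url)),
    );
    // Consume every body so each connection is closed and cleaned up.
    bodies.push(...(await Promise.all(responses.map((r) => r.arrayBuffer()))));
  }
  return bodies;
}
```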
@bartlomieju This issue cannot be reproduced on the current version of Deno; it seems to exist only in outdated versions.
Sorry for wasting anyone's time.
Glad to hear that it's no longer a problem! No worries, bug reports are always appreciated.
Normally, when sending too many fetches (I was spamming a URL with an infinite loop of HTTP requests), I would get the following error:

```
error: Uncaught Error: Uncaught Http: error sending request for url (URL): error trying to connect: tcp connect error: Too many open files (os error 24)
```
Yet when I moved the logic into a worker (sending 400 requests at a time, terminating the worker, then opening the worker again, in a loop), Deno does not catch it at all.
Instead, it lets the memory consumption build up.
main file (JavaScript):
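A sketch of what the main file looked like, reconstructed from the description above; the file name and message protocol are assumptions:

```js
// main.js (sketch): spawn the worker, wait for its "done" message,
// terminate it, and repeat forever.
while (true) {
  const worker = new Worker(new URL("./worker.js", import.meta.url).href, {
    type: "module",
  });
  await new Promise((resolve) => (worker.onmessage = resolve));
  worker.terminate();
}
```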
worker:
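And a sketch of the worker, matching the reproduction described above; the URL is a placeholder, and this is the pre-workaround version that never reads the bodies:

```js
// worker.js (sketch): fire 400 fetches without consuming the bodies,
// then tell the main thread we're done so it can terminate this worker.
const requests = new Array(400)
  .fill("https://example.com/") // placeholder URL
  .map((url) => fetch(url));
await Promise.all(requests);
self.postMessage(null);
```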
Alternatively, is there a way to manually clear this memory using any of Deno's APIs? Is there an API that would fire a request without keeping it open, clearing the memory and connection afterward?
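The closest thing I know of: in newer Deno versions fetch accepts an AbortSignal, which tears down the in-flight request and its connection. A sketch (the URL is a placeholder):

```js
const controller = new AbortController();
const promise = fetch("https://example.com/", { signal: controller.signal });
controller.abort(); // cancels the request and releases the connection
await promise.catch(() => {}); // the aborted fetch rejects with an AbortError
```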