denoland / deno

A modern runtime for JavaScript and TypeScript.
https://deno.com
MIT License

Deno allows opening too many files by moving to workers; forces OS OOM #8562

Closed · ghost closed this issue 3 years ago

ghost commented 3 years ago

Normally, when sending too many fetches, I would get the following error:

error: Uncaught Error: Uncaught Http: error sending request for url (URL): error trying to connect: tcp connect error: Too many open files (os error 24) at WorkerImpl.#poll ($deno$/web/workers.ts:210:17) exit status 1

(I was spamming a URL with an infinite loop of HTTP requests)

Yet when I move the logic into a worker (sending 400 requests at a time, terminating the worker, then spawning it again, in a loop), Deno does not catch the error at all.

Instead, it allows the memory consumption to build up.

main file (JavaScript):

const { href } = new URL(
    "./worker.js",
    import.meta.url
);

const workerOptions = {
    type: "module",
    deno: true
};

const listenerOptions = {
    passive: true,
    once: true
};

let numberOfHTTPRequestsSent = -400;
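// (Starts at -400 so the very first log, printed before any batch has completed, reads 0.)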

(function listener(evt) {
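    // Each "message" event terminates the finished worker and spawns a fresh one;
    // on the initial direct call, evt is null, so nothing is terminated yet.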
    evt?.currentTarget.terminate();

    console.clear();

    console.log(
        "%i HTTP requests sent during this session",
        numberOfHTTPRequestsSent += 400
    );

    new Worker(
        href,
        workerOptions
    ).addEventListener(
        "message",
        listener,
        listenerOptions
    );
})(null);

worker:

const arr = new Array(400)
    .fill("https:// URL not disclosed for privacy reasons")
    .map((url) => fetch(url));

await Promise.all(arr);

self.postMessage(null);

Alternatively, is there a way that I may manually clear this memory using any of Deno's APIs? Is there an API that would fire a request without keeping it open, and then clear the memory and connection afterward?

bartlomieju commented 3 years ago

@00ff0000red this is a known bug in fetch (https://github.com/denoland/deno/issues/4735): if you consume the body in any way, the underlying connection will be closed and cleaned up.
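
For illustration, a minimal sketch of what consuming (or discarding) the body can look like; the URL is a placeholder:

const response = await fetch("https://example.com/");

// Read the body to completion so the connection can be released...
await response.arrayBuffer();

// ...or, if the body isn't needed, discard it instead:
// await response.body?.cancel();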

ghost commented 3 years ago

@00ff0000red this is a known bug in fetch (#4735): if you consume the body in any way, the underlying connection will be closed and cleaned up.

Ahh, that makes much more sense. It's not leaving the files open like the first error,

error: Uncaught Error: Uncaught Http: error sending request for url (URL): error trying to connect: tcp connect error: Too many open files (os error 24)

instead, it's literally building up memory that isn't cleared.

I'll try consuming the body and see if that fixes it; if so, I'll close this in favor of the other issue.

ghost commented 3 years ago

Update: consuming the body seems to have had no effect whatsoever.

const arr = new Array(400)
    .fill("https:// URL not disclosed for privacy reasons")
    .map((url) => fetch(url));

const responses = await Promise.all(arr);

const buffers = responses.map(
    response => response.arrayBuffer()
);

await Promise.all(buffers); // wait for response bodies to be consumed

self.postMessage(null); // signal the main thread, which then terminates this worker

The program runs out of memory at the same point as it did before.

Update 2: I can run that on the main thread now, but it's still broken when running it on the worker thread on and off (spawning and terminating in a loop); maybe the GC isn't getting to it in time?

bartlomieju commented 3 years ago

@00ff0000red could you open a PR with the presented reproduction as a test? I will look into this problem some time in the future.

ghost commented 3 years ago

@bartlomieju I don't think it's possible to create an automated test for this; Deno died after accumulating too much memory, and there was no runtime exception that I could simply catch in JavaScript.

Also, I have since deleted or rewritten the code to avoid the bug, so... I'll see if I can replicate it myself again :)

Lastly, do you have any server that I could just spam with eight thousand HTTP requests?
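
(Alternatively, a throwaway local server would do; a minimal sketch, assuming a recent Deno where Deno.serve is stable; 1.x-era releases used std/http's serve instead:)

// server.js -- run with: deno run --allow-net server.js
// Serves a tiny response on http://localhost:8000/ by default.
Deno.serve((_req) => new Response("ok"));

The reproduction could then point its fetches at http://localhost:8000/ instead of a public URL.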

ghost commented 3 years ago

Update: I had practically never updated Deno and only just moved to 1.5.4. The worker code doesn't even work there; Promise.all on the buffers wasn't resolving, so I tried this:

buffers.map(
    async (promise, i) => {
        await promise;
        console.log(i);
    }
);

It quickly logs up to 201 and then stops entirely.

Lowering iterations to 200 instead of 400 worked.
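
That suggests simply bounding how many requests are in flight at once. A minimal sketch of batching (the batch size and placeholder URL are assumptions, not values from this thread):

// Issue requests in bounded batches so only `batchSize`
// connections are open at any one time.
const batchSize = 100;
const urls = new Array(400).fill("https://example.com/");

for (let i = 0; i < urls.length; i += batchSize) {
    const responses = await Promise.all(
        urls.slice(i, i + batchSize).map((url) => fetch(url))
    );
    // Consume each body so the connection can be released.
    await Promise.all(responses.map((r) => r.arrayBuffer()));
}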

ghost commented 3 years ago

@bartlomieju This issue cannot be reproduced on the current version of Deno; it seems to exist only in outdated versions.

Sorry for wasting anyone's time.

bartlomieju commented 3 years ago

@bartlomieju This issue cannot be reproduced on the current version of Deno; it seems to exist only in outdated versions.

Sorry for wasting anyone's time.

Glad to hear that it's no longer a problem! No worries, bug reports are always appreciated.