rocket-connect / for-emit-of

Turn Node.js Events into Async Iterables.
https://www.npmjs.com/package/for-emit-of
MIT License

Keeping alive iterable with no timeout #8

Closed Farenheith closed 4 years ago

Farenheith commented 4 years ago

Closes #7

To keep an async iterable with no timeout alive, a setTimeout is scheduled from time to time. This setTimeout only happens when no timeout is specified, because the timeout mechanic is already enough to keep the process running.
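
A minimal sketch of the idea (a hypothetical helper, not the exact PR code): a chained setTimeout keeps a pending timer registered with the event loop until the iterable finishes, so Node does not exit while the internal event race is pending.

function startKeepAlive(intervalMs = 1000) {
  let timer;
  const tick = () => {
    timer = setTimeout(tick, intervalMs); // a pending timer counts as an active handle
  };
  tick();
  return () => clearTimeout(timer); // call this when the iterable ends so the process can exit
}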

Note that I did not create a test case for this problem, as I couldn't figure out how to, for two reasons:

danstarns commented 4 years ago

I feel we should keep the current behavior if keepAlive is not set; this way we don't introduce any breaking changes.

Farenheith commented 4 years ago

@danstarns the thing is that with your previous strategy, which used only sleep(0), this situation wouldn't happen, so I introduced a breaking change when the Promise.race strategy came in. It does bring a performance improvement, as it demands less processing by handing the responsibility of checking for the next emitted event over to Node, but it brought this side effect along with it.

So maybe we can offer the option of setting keepAlive to 0, for example, so that the library user explicitly opts into the current behavior. What do you think?

danstarns commented 4 years ago

With your suggestion, if I were to set keepAlive to 0 with the code below:

const { EventEmitter } = require("events");
const forEmitOf = require("for-emit-of");

async function test() {
  const neverEmit = new EventEmitter();

  for await (const item of forEmitOf(neverEmit, { keepAlive: 0 })) {
    console.log(`never will print ${item}`);
  }

  console.log("gets here");
}
test();

Would I see this in the log?

console.log("gets here");
Farenheith commented 4 years ago

With my suggestion, setting keepAlive to 0 would cause the process to die and you'd not see the log. This is what happens now, but I think it looks more like undesirable behavior than a feature; I can't imagine a scenario where this is needed.

PS: With the code as it is now in this PR, the behavior you described will happen.
PS2: Actually no, since the event emitter never emits an end event.

danstarns commented 4 years ago

@Farenheith I see; something feels odd about manually halting the process. I feel we should keep the default behavior as close to 'normal' as possible. I can't imagine a scenario either.

Farenheith commented 4 years ago

I agree, but I feel the need to clarify a point that maybe I failed to make clear before: looking back at the issue I opened, I think I misused the word "hang".

What happens is that, without this keepAlive control, the code example below just shuts down abruptly:

const { EventEmitter } = require("events");
const forEmitOf = require("for-emit-of");

async function test() {
  const neverEmit = new EventEmitter();

  console.log("before start");
  for await (const item of forEmitOf(neverEmit, { keepAlive: 0 })) {
    console.log(`never will print ${item}`);
  }

  console.log("gets here");
}
test();

Output:

before start

And then, after that first output, Node finishes executing. I thought it'd be odd to write the code above and have it finish before it reaches the end. This happens because, for some reason, promises created like this:

new Promise((resolve) => emitter.on("end", resolve));

are not enough to keep Node running. At this point in the code:

const winner = await Promise.race(getRaceItems());

Promises like that are actually all we have, so in a project that uses this package, if nothing else is keeping the event loop busy (a setTimeout, an Express application, some queue consumer, a client connected to a socket, etc.), the process will just shut down.
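
For illustration, here is a self-contained script that reproduces this (assuming nothing else is keeping the event loop busy):

const { EventEmitter } = require("events");

const emitter = new EventEmitter();

async function main() {
  console.log("waiting");
  // A pending promise on its own does not register an active handle with the
  // event loop, so Node exits right after "waiting" is printed.
  await new Promise((resolve) => emitter.on("end", resolve));
  console.log("resolved"); // never reached
}

main();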

But I agree that this is the default behavior, and this situation is very rare. I also agree that the solution I'm giving doesn't smell great; it's kind of a workaround. I dealt with a scenario like this in the past. I honestly don't remember all the details, but I had a console application that created a promise like that at startup, and then some other operations began. The application then just finished executing without explanation, and while debugging it I discovered this situation. I really believed it was a Node issue, as it shouldn't shut the process down while there's an unresolved promise. My concern is that someone who uses this library may not be aware this situation can occur and will have a hard time figuring it out.

What do you think?

Edit 1: I looked this issue up in the Node project and found it here. Turns out this is working as intended, so maybe it's important to guard against this case.

Farenheith commented 4 years ago

@danstarns I took a deeper look into the problem to get an idea of how processing-expensive the strategy I'm proposing is, and I'd like to endorse this choice. The code I tested is the following:

const { EventEmitter } = require("events");
const forEmitOf = require("./dist");
const { sleep } = require("./dist/sleep");

async function test() {
  const neverEmit = new EventEmitter();

  console.log("for start");
  setTimeout(() => neverEmit.emit("end"), 10000);

  for await (const item of forEmitOf(neverEmit, { debug: true })) {
    console.log(`never will print ${item}`);
  }
  console.log("for end");
}
test();

Also, I added some debug console.log calls to the code, with an option to enable or disable them; let me know if you think it's a good idea to keep them in the code.

The result observed is very good. Take a look:

$ node test.js                       
for start
No more results to yield but emitter still active. Starting timeout race
1 keepalive cycles, 1.002839398 secs, 1.00 cycles per second
2 keepalive cycles, 2.004936182 secs, 1.00 cycles per second
3 keepalive cycles, 3.005374469 secs, 1.00 cycles per second
4 keepalive cycles, 4.006765173 secs, 1.00 cycles per second
5 keepalive cycles, 5.008340608 secs, 1.00 cycles per second
6 keepalive cycles, 6.009823084 secs, 1.00 cycles per second
7 keepalive cycles, 7.010428372 secs, 1.00 cycles per second
8 keepalive cycles, 8.011966104 secs, 1.00 cycles per second
9 keepalive cycles, 9.013585252 secs, 1.00 cycles per second
Finished response racing. Winner: undefined
for end
Finishing keep-alive control: 9 keepalive cycles, 10.015076649 secs, 0.90 cycles per second

Take a look at the points where I inserted the debug output. During the 10-second test, the only lines executed were:

Also, about the keepalive:

So, this strategy delivers pretty good performance, with almost no CPU demand. I was not very sure about this solution until now, but after this test I'm very confident in it!

danstarns commented 4 years ago

@Farenheith Hey!

Nice work researching this; I have been playing around with it. My question is: how can I disable this keepalive feature? I'm still in two minds about its default, leaning more towards it being disabled by default.

I like the idea of the debug output. Could we leverage an existing package such as debug so we can prefix the logs? That makes them easier to see in a busy terminal.
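
For example, something along these lines (just a sketch; the namespace is only a suggestion):

const createDebug = require("debug");
const log = createDebug("for-emit-of:keep-alive"); // prefixes every line

// Printed only when e.g. DEBUG=for-emit-of:* is set in the environment.
log("No more results to yield but emitter still active. Starting timeout race");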

Farenheith commented 4 years ago

I can leave it disabled by default, but keep in mind this behavior didn't happen in version 1.0.4: it was introduced with the use of Promise.race, so I really think it's better to keep it enabled by default, even just as a matter of semver.

I just sent a commit where, if you leave it as 0, the keepAlive is performed, and I also left the default at 0, so if you're sure that's the best setup, it can go on.

About the debug, I think it's a very good idea. I created something in a hurry just to check what I needed.

danstarns commented 4 years ago

@Farenheith It kinda sucks about the semver situation. What do you think is the best version to release your feature with? Could it be classed as a fix from 1.0.4? 🤷‍♂️

This is all working as intended for me :)

Farenheith commented 4 years ago

Since a library user will very rarely fall into a situation where the keepAlive is needed, maybe keeping it disabled by default is the best choice.

After all, we've put some TSDoc comments on the option properties, which can give some guidance for such a situation.

danstarns commented 4 years ago

@Farenheith It did make me think about all the different ways someone might want to use it, in particular with dynamic subscriptions. I'll tag this 1.3.0; it shall be live soon. 🍻