
V8/Node: To Async or not to Async, that is the question (might be a bug, specification, implementation or documentation issue). #2201

Closed · joeyhub closed this issue 4 years ago

joeyhub commented 4 years ago

Using the latest version of Node I'm finding that async functions (with await) don't always work as I might expect. As far as I can tell, they're fundamentally broken.

The problem is that they run asynchronously when they shouldn't. I believe this is an easy mistake to make given they're called asynchronous. We already have ways to make something run asynchronously: usually either setTimeout(f, 0) or process.nextTick(f) (which is more explicit).

The background I'm coming from is that I've implemented toolkits for async in a number of languages and have used it heavily.

When I implement my own equivalent set of concurrency helpers, which include equivalents to promises and also do things such as use generators to emulate async, nothing actually runs asynchronously unless it has to.

A little background might help.

Traditionally a synchronous function might appear so:

```js
function downloadUserData(input) {
    input = JSON.stringify(input);
    const output = download(input);
    console.log(output);
}
```

Realistically in JS we don't usually have things like a synchronous download function, because it would block execution of everything else, so we need to break the function in two. This propagates such that all functions depending on it must also be broken in two.

```js
function downloadUserData(input, finished) {
    function part0() {
        input = JSON.stringify(input);
        download(input, part1);
    }

    function part1(output) {
        console.log(output);
        finished();
    }

    part0();
}
```

This could be written more concisely but is laid out like this to make the need to break functions in two more apparent. As you can see, it now has to accept a callback so that it can let a calling script know it has finished. This propagates all the way up the call path.

All of those callbacks are called synchronously, however. The only asynchronous part is when the callback passed to download is called. Once that fires, all the other callbacks in the chain are actually called synchronously from that point up.

A facility that lets this be written more easily should reproduce the same behaviour. If I were going to make a little Babel plugin to handle await, it would basically convert the function back into callback hell, where it would behave the same as the split example here. The example above can be seen as being transpiled from:

```js
async function downloadUserData(input) {
    input = JSON.stringify(input);
    const output = await download(input);
    console.log(output);
}
```

We're really just talking about a bit of syntactical sugar here. It's barely more than function.toString().split('await').

The problem is node.js doesn't do this. As far as I'm concerned an async function should only actually be async if on the completion of the current event (script execution) from the event queue the async function is waiting on an unresolved promise.

You might think this to be the case...

```js
(async () => { console.log('start') })(); console.log('end');
```

Produces: start, end

However it turns out that an async function ALWAYS becomes async if it awaits at all.

```js
(async () => { console.log(await Promise.resolve('start')) })(); console.log('end');
```

Produces: end, start

There's no real reason for the above to be async. Perhaps at some point there was concern that async functions being sync or async depending on promise resolution would be too confusing. But this problem already exists to some degree, given an async function can already behave either way based on some if condition.

This to me is a fundamental design flaw. It's not only unnecessary but removes options. If async functions were only async when they actually need to be, someone who had a problem with that could always force async with setTimeout; but when you make them always async on await, even if the await isn't for something asynchronous, you lose options.

None of this actually comes to light until you start using promises and concurrency properly. One of the problems with promises and async is that people only use them for simple cases, and the facilities made for more complex cases are very simplistic.

I tend to want things such as this:

```js
// Excuse my keywords; this is pseudocode.
const remove = new SensibleSinglePromise(), click = new SensiblePluralPromise();
Thing.on('click', click.return);
Thing.on('remove', remove.return);

while (true) {
    const [ev, {return: ret, throw: thr}] = await select([click.await, remove.await]);
    // Do what I want here with ev and ret, such as a switch statement.
}
```

There are many variations of this; this one is not the most elegant, with some things done badly on purpose to express the point more clearly. You can also do click.partial('click') then simply select(click, remove), or then simply use Promise.race. In this example you might instead just rely on the Event object. I've ignored that as there are cases past that.

The problem with this is that whatever I do to make my life easier tends to involve the gratuitous use of promises and async functions to make multiplexing easier, and that's where everything goes to async/await hell.

When something that could be run directly instead resolves asynchronously, it immediately creates problems where I'm having to queue things up to buffer them.

Where it becomes a complete lemon is that when you break what should be a single flow of execution into several blocks on the event queue, sooner or later another event comes in and gets put on the queue. If that event has a shorter chain then it gets injected out of order, making it a nightmare to then have to sort and reorder events, or to hack things so the function call chains stay the same length, which isn't happening. Reordering is problematic as it either means some nasty waiting for everything to clear first, becoming blocking, or needs to propagate information in a way that's horrific to implement.

The problem is seen here because between a singular listener and a plural listener you can expect their implementations to differ, with differing call stack sizes. To me this is fundamentally broken: it orders events effectively arbitrarily by call stack depth.

The quicker it's rectified the better, because more and more people are going to start relying on this behaviour, using some variation of (async () => await Promise.resolve())() in place of setTimeout(f, 0) for example.

This is confusing as I'm sitting there thinking: why is resolve async? It's not setTimeout(f, 0), it's just the callback, or rather a wrapper around that, and when you actually have a callback it should run synchronously. It's as if writing setTimeout(() => setTimeout(callback, 0), 1000). It's redundant. What happens if I do process.nextTick(resolve)? Is it the same as process.nextTick(() => process.nextTick(resolve))?
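
For reference, a minimal illustration (my own, assuming Node.js) of the queues being conflated here. Promise reactions and process.nextTick callbacks are microtasks and run before timer callbacks once the current script completes:

```js
setTimeout(() => console.log('timeout'), 0);           // timer (macrotask) queue
Promise.resolve().then(() => console.log('promise'));  // promise microtask queue
process.nextTick(() => console.log('nextTick'));       // nextTick queue, drained first in Node
console.log('sync');
// Output: sync, nextTick, promise, timeout
```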

This isn't the only problem with async. I don't think the people specifying these things actually write code. Why would you make Promise take a callback? Imposing a callback on people to escape callbacks, from a library whose only purpose is to give you a callback. There's something wrong with the theory and practice here.

Every time I start a new project the first thing I do is proxy promise with something that extracts resolve and reject.

Instead of:

```js
var p = new Promise(resolve => setTimeout(resolve));
```

Why not:

```js
var p = new Promise();
setTimeout(p.resolve);
```
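
A minimal sketch of such a proxy (the name deferred is my own; newer runtimes expose the same shape as Promise.withResolvers()):

```js
function deferred() {
    let resolve, reject;
    const promise = new Promise((res, rej) => {
        resolve = res;  // capture the executor's callbacks
        reject = rej;   // so they can be called from outside
    });
    return { promise, resolve, reject };
}

// usage
const d = deferred();
setTimeout(() => d.resolve('done'), 0);
d.promise.then(console.log); // logs 'done'
```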

This seems to be a problem with promises in general. Why are they running async? They're meant to represent async, not run async. Did someone get something mixed up?

This is a separation-of-concerns failure. Promises are doing two things. They're just meant to be a container for callbacks, but they're also reimplementing process.nextTick.

Because they're also being used for async the concerns are clashing. There needs to be a separate Promise that sticks to the task purely of holding onto a collection of callbacks.

ORESoftware commented 4 years ago

async/await always runs asynchronously, even if there is no await keyword present.

```js
async function foo() {
  return 'dog'
}
```

run it like:

```js
foo().then(console.log);
```

async/await will always return a promise, and it will only resolve the return value asynchronously.
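
To spell that out (my own illustration of the claim above):

```js
async function foo() {
  return 'dog';
}

foo().then(console.log); // 'dog' logs second,
console.log('cat');      // because .then callbacks are queued as microtasks
// Output: cat, dog
```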

joeyhub commented 4 years ago

async/await always runs asynchronously

That's a design flaw which appears to be linked to a design flaw in promises which unconditionally appear to induce side effects.

You can try this yourself. Implement your own promise class and you'll see that it's not necessary to include setTimeout(f, 0).

I can whip a simple one up really quickly:

```js
function promise(f) {
    var buffer = [], result = null;

    // f receives the resolver. Resolving flushes any buffered callbacks
    // synchronously and records the result for late subscribers.
    f((...args) => {
        const callbacks = buffer;
        buffer = null;
        result = args;
        callbacks.forEach(callback => callback(...args));
    });

    // The returned subscriber is called synchronously if already resolved,
    // otherwise buffered until resolution.
    return callback => {
        if(buffer === null) {
            callback(...result);
            return;
        }

        buffer.push(callback);
    };
}
```

As you can see, there's no need for creating a new tick. It's just returning a callback collection that will be called when there's a result. The async should be separate from that. Promise, for example, should extend something that does that if it wants to add other behaviour or be a different class or something.

The functionality Promise adds is redundant, and not only that, it introduces unintended side effects. If you want a non-async promise to be async, then make it async, such as new Promise(resolve => setTimeout(resolve, 0)).then(() => console.log("Now I'm async.")).

This should not be async: (new Promise(resolve => resolve()).then(() => console.log('Boy am I sync.'))).

The point should be to wrap async, not be async, or wrap async with async. async/await always being async is part of the same problem. Someone got confused and thought they were meant to implement async.

Try my promise and you'll see it works just as well but without the side effect. So why do you always need it? That's a coupling problem.

Mine works without it:

```js
(new Promise(resolve => setTimeout(() => resolve('start'), 0))).then(console.log); console.log('end');
// end
// start

new Promise(resolve => resolve('start')).then(console.log); console.log('end');
// end
// start

promise(resolve => setTimeout(resolve, 0))(() => console.log('start')); console.log('end');
// end
// start

promise(resolve => resolve())(() => console.log('start')); console.log('end');
// start
// end
```

That's 100% proof that the side effect isn't necessary and that it also breaks ordering.

ORESoftware commented 4 years ago

what side-effect are you referring to? keep it concise

joeyhub commented 4 years ago

async/await always runs asynchronously

That is the side effect.

addaleax commented 4 years ago

The quicker it's rectified the better, because more and more people are going to start relying on this behaviour, using some variation of (async () => await Promise.resolve())() in place of setTimeout(f, 0) for example.

Changing that would be a breaking change to the language spec and the language spec doesn’t do breaking changes at all as a principle, which kind of nips all of this in the bud really quickly.

joeyhub commented 4 years ago

The feature is broken already. It's questionable whether it would be a breaking change. You go from having arbitrary order to the correct order. It's kind of hard to rely on arbitrary order; implementations should already be capable of handling any order given it's potentially arbitrary. The one caveat is that although the order is arbitrary (not based on anything particularly useful) it's not random, so you can't entirely rule out dependence on ordering by call stack depth. A major problem is the use of it in place of next tick.

They're then going to have to implement async2/await2 keywords and Promise2 classes.

It's already a serious problem with libraries returning the broken promise implementation.

Now it's severely compounded by being embedded within the async/await keywords.

This now at the least forces you to transpile everything if you want async/await on the server taking us a step back in time.

This is absolutely shocking. They should have gotten this right before standardising it.

addaleax commented 4 years ago

The fact that await Promise.resolve() leads to an extra asynchronous delay is by design and 100 % relied upon in real-world code, changing that just isn’t possible, no matter what consensus on the “right” behaviour would be. Sorry.

joeyhub commented 4 years ago

That's a faulty design. No one relies on it. It doesn't introduce any new features.

What someone has done is combine a callback collection with process.nextTick. They coupled two disparate concerns. You already have either of those and can combine them or not. The way they've combined them means you can't have one without the other. You've gone from having three choices to only one.

The issue isn't about taking something away, it's that something has been taken away that people depend on.

That you've gotten it entirely the wrong way around proves that you haven't understood the problem.

I suggest you actually try demultiplexing multiple event sources with varying cardinalities, varying call stack depths (which is intrinsic to async coding) and state dynamics dependent on order then see what happens.

This isn't a matter of consensus either. Try it and see that it is broken. My proof already demonstrates that it is unnecessary for the standard use case and where it is necessary you don't need the promise to be doing that you can optionally switch to async.

devsnek commented 4 years ago

await always results in a tick, regardless of what you pass. A very intentional and important part of promises is that you must always wait at least one tick to read a promise. This also carries security implications revolving around call stacks and re-entrancy, and it's absolutely reasonable for code to depend on it.
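
For example (my own illustration of the guaranteed tick), even awaiting an already-resolved promise defers the continuation:

```js
(async () => {
  console.log('before await');
  await Promise.resolve();    // already resolved, but still yields for one microtask
  console.log('after await');
})();
console.log('sync code');
// Output: before await, sync code, after await
```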

joeyhub commented 4 years ago

Let me demonstrate the silliness of it.

This is callback hell:

```js
// modern
get('/a', a => post('/b', a, b => post('/c', b, c => post('/d', c))));

// historic
get('/a', function(a) {
    post('/b', a, function(b) {
        post('/c', b, function(c) {
            post('/d', c)
        })
    })
});
```

Now if you were reviewing code doing this for no reason:

```js
get('/a', function(a) {
    setTimeout(function() {
        post('/b', a, function(b) {
            setTimeout(function() {
                post('/c', b, function(c) {
                    setTimeout(function() {
                        post('/d', c)
                    }, 0)
                });
            }, 0)
        })
    }, 0)
});
```

Would you not be questioning their sanity? Is that how you write your code when using callbacks?

Promises as a drop-in replacement for callbacks do exactly that. That means they fail immediately, as they fail to match what they're replacing. They're not compatible. The above form of callback handling is what you need to justify, and you need to justify it for every case. Given you can easily create cases where it's needless, it's impossible to justify it as an implicit, constant, unconditional side effect.

If your language requires you to code like that to achieve security then it's broken. This is opposite world. That's the polar opposite of absolutely reasonable. Are you also saying all synchronous code is insecure?

addaleax commented 4 years ago

What are you trying to achieve here? As I said, like it or not, the current behaviour is going to stay, no matter how convincing you might be that it is wrong, unintuitive, unjustified, or any other kind of bad.

Would you not be questioning their sanity?

No, because somebody else’s sanity is not up for debate here at all.

devsnek commented 4 years ago

So back in the olden days of callbacks, a lot of APIs would accidentally fall into this pattern:

```js
function doSomething(cb) {
  if (exitCondition) {
    cb(null); // cb called synchronously
  } else {
    internalMagicWithIO(cb); // cb called asynchronously
  }
}
```

There have been quite a few blog posts on this topic (keyword: "zalgo") so I won't go into it too much here, but a large goal is abstracting over potentially dangerous logic like the above.
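
(For context, a sketch of the conventional fix for that pattern, not from the original comment: make the synchronous branch asynchronous too, so the callback timing is consistent.)

```js
function doSomething(cb) {
  if (exitCondition) {
    process.nextTick(() => cb(null)); // now always called asynchronously
  } else {
    internalMagicWithIO(cb);
  }
}
```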

In any case, Node.js is not in a position to change how promises work (contrary to what you said above, promises have very well defined behaviour and execution order), so there's not much we can do about your concerns.

joeyhub commented 4 years ago

If users make mistakes, that's on them. If you take away essential features, why not take away the whole language? Then they can't get anything wrong with it. Mixing the two shouldn't be a problem. Order doesn't matter unless it does, and when it does, only the async-only-when-necessary approach can preserve order. I never had a big problem keeping in mind that a call may run either sync or async.

If that were merely the default it might not be a problem, but it's the only way it works.

When you program asynchronously as though synchronously, you need things like select() to achieve that fully. This design flaw makes it impossible to achieve that fully, as it breaks select implementations (unless you want to pass a representation of the call stack down with everything, rather than staying agnostic, and basically reimplement promises on top of broken promises; good luck with that).

That it only works one way, inducing an additional side effect, makes it extremely difficult to implement select, perhaps impossible in some cases, given that you want it to take promises and it shouldn't have to be aware of their entire call stack down to the root to keep the order. It goes from being able to work single-ended to only working double-ended.

That behaviour needs to be redefined and re-implemented as an additional implementation with the previous one deprecated, likely for a very long time.

Otherwise it's better to just give up and go back to callback hell, as at least that provides the correct result.

If you try to use something like select, notice you won't have problems like getting block reads in the wrong order.

Try to achieve the same with JS async/await. It's not immediately obvious, as you have to be aiming for the final lap of completeness: being able to write async code as though synchronous.

addaleax commented 4 years ago

This repository is a place for asking for help when encountering concrete issues with Node.js. This issue, however, is explaining a perceived design flaw rather than asking for help, and in particular a perceived design flaw in the language that isn’t under the control of Node.js and that we cannot change.

I’m going to close this issue out because this really isn’t the right place for this kind of debate.

brodycj commented 4 years ago

Hi @joeyhub,

I think I can solve one of your cases, and I think it can help you and the rest of us understand what is going on here.

However it turns out that an async function ALWAYS becomes async if it awaits at all.

```js
(async () => { console.log(await Promise.resolve('start')) })(); console.log('end');
```

Produces: end, start

Yes.

First step is to please use highlighting like this:

```js
// some JavaScript code
```

Breaking your code into 4 lines to make it easier for people like me to understand:

```js
(async () => {
  console.log(await Promise.resolve('start'))
})();
console.log('end');
```

The explanation is that this JavaScript code kicks off an asynchronous function, which may do some I/O, and then the script just keeps going.

Here is what I want to do but it will not work:

```js
// DOES NOT WORK:
await (async () => {
  console.log(await Promise.resolve('start'))
})();
console.log('end');
```

A possible workaround is to wrap that whole thing into an async IIFE, which does log start before end:

```js
(async () => {
  await (async () => {
    console.log(await Promise.resolve('start'))
  })();
  console.log('end');
})();
```

I think this code is still not 100% ideal but at least shows how to do what you were trying to do.
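
(As an aside, not in the original comment: newer Node.js versions, 14.8 and later, support top-level await in ES modules, which makes the "does not work" form above valid in a .mjs file.)

```js
// valid at the top level of an ES module (e.g. example.mjs)
await (async () => {
  console.log(await Promise.resolve('start'));
})();
console.log('end');
// Output: start, end
```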

I would highly recommend that you spend some time researching how this whole mechanism works before starting this kind of a discussion. I also find your arguments to be pretty hard to follow. I think spending some more time researching this whole subject area could help you make clear, succinct, easy-to-follow arguments about what works right and what potential pitfalls people in the community can watch out for.

joeyhub commented 4 years ago

Your solution is a misunderstanding of the problem. I have something like that for trivial cases, and it only works because you're aware of the expected order and reordering accordingly. It's an easy mistake: you see something out of order and it makes sense to just reorder it.

I have something similar where end comes in before read, so to correct the order I have to tell the read iterator to end (drain), then add that to the selection and wait for that to end for the real end. That case is a bit of a cheat as I'm able to fix the order externally without too much cost.

However, can you fix the order of: read, read?

For cases where I want order preserved but it's not deterministic (generic aggregation), there is nothing reasonable I can do. The user can activate one thing then another, but there will be no way for Promise.race to preserve that order barring all the promises having the same call stack depth.

This means that if, for example, I encapsulate events somehow, the call stack starts to become arbitrary, often polymorphic, and then race becomes useless when order matters, which it often does.

There is also a bit of confusion between the problem and the solution. There's no ideal solution, as the design went out flawed, but I don't appreciate certain other people bullying with gaslighting, insisting that a problem I've proven to exist is merely 'perceived'.

Compatibility-wise it will be a mess where people have done this...

```js
var buffer;
download(data => buffer.push(data)); // assumes download calls back asynchronously
buffer = [];
```

This is a coding mistake (given maybe-async) but works with the current implementation and breaks with the correct implementation. It's worse, as the broken implementation allows broken code to accumulate. The sync/async problem exists even with the current implementation: having if (!cached) cached = await load(); maybeAsync(); is enough to bypass the await and make a maybe-async callback.
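
Spelled out (my own expansion of that inline snippet; load and maybeAsync are stubs for the hypothetical names in it):

```js
const load = async () => 'data';              // stub for the snippet's load()
const maybeAsync = () => console.log('ran');  // stub for the snippet's maybeAsync()
let cached = null;

async function getData() {
  if (!cached) cached = await load(); // first call: suspends for at least one tick
  maybeAsync();                       // cached calls: runs synchronously, no tick
  return cached;
}
```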

This is just something users have to learn as part of the endeavour of learning to code async. Attempting to preemptively fix things for them does more harm than good.

It's easy to make your own promise implementation but the async and await keywords are another thing since you're locked in with that.

If you keep using Promise.race recursively, nested, as race is essentially the same as select, you'll start to see some problems in time.

Here is another way to see it. Make two async call stacks of different lengths and resolve them in random orders.

Use Promise.race. Did you expect the results to be sorted by stack depth (imagine results.sort((a, b) => a.error.stack.split('\n').length - b.error.stack.split('\n').length), etc.)? Does that seem particularly useful to you? Would you not prefer the events in order of original resolution?

This isn't purely from a user perspective. I also design and build async systems. This is a mistake I've almost made in designs at times. It appears to fix one problem, maybe-async, then causes another problem down the line, nuking order, making it impossible to finish a full async implementation. Replacing a problem with a problem tends to invalidate it as a solution. It's just a deferred problem, deferred to the point it becomes unsolvable.

joeyhub commented 4 years ago

You can see an example here of it sorting by call stack and losing the original order of events (which matters; for example, block reads from a file need to be in order, as do TCP packets, clicks, over and out, state transitions, etc.):

```js
/**
 * What this does:
 *
 * This shows that the current async/await and promise implementation
 * introduces the side effect of determining order by stack size.
 * It provides two async functions with the only difference being stack depth.
 * To rule out differences it resolves in both orders and races in
 * both orders.
 */
(async () => {
    const
        proxy = () => {
            var resolve, reject;

            const promise = new Promise((a, b) => {
                resolve = a;
                reject = b;
            });

            return {promise, resolve, reject};
        },
        partial = (id, proxy) => ({
            resolve: arg => proxy.resolve([id, arg]),
            reject: arg => proxy.reject([id, arg]),
            promise: proxy.promise
        }),
        order = (order, array) => array.map((_, i) => array[order[i]]),
        orders = [[0, 1], [1, 0]];

    for(const order0 of orders)
        for(const order1 of orders) {
            const
                /**
                 * The terms first and last are a contrivance.
                 * In practical use the order is deferred to the event emitter.
                 * It would be a mistake to assume the correct solution would be
                 * `await first;await last;`.
                 * Passing the index like this up the chain isn't always viable though
                 * is a work around in some cases. It would basically mean all real
                 * event sources would want to be wrapped like this. It might seem
                 * to make sense to wait if there's a gap but what about filtering
                 * and mixing different event sources?
                 * Even sort by timestamp for example ignoring two things at the
                 * same time will not tell you if there's a previous event buried on
                 * the queue.
                 */
                first = partial(0, proxy()),
                last = partial(1, proxy()),
                /**
                 * Although these are statically defined in reality the call stack
                 * would be dynamic.
                 * Even for the same event emitter there might be a variable
                 * callback depth based in if statements or forking the events
                 * into two listeners.
                 * There is an allowance with Promise.race where
                 * consumer0 is allowed to resolve an additional step that's
                 * inexplicable.
                 */
                consumer0 =
                    async () => {
                        const value = await (async () => {
                            const value = await (async () => {
                                const value = await (async() => {
                                    const value = await (async() => {
                                        const value = await (async() => {
                                            const value = await (async() => {
                                                const value = await first.promise;
                                                console.log('consumer0: Depth 6');
                                                return value;
                                            })();
                                            console.log('consumer0: Depth 5');
                                            return value;
                                        })();
                                        console.log('consumer0: Depth 4');
                                        return value;
                                    })();
                                    console.log('consumer0: Depth 3');
                                    return value;
                                })();
                                console.log('consumer0: Depth 2');
                                return value;
                            })();
                            console.log('consumer0: Depth 1');
                            return value;
                        })();
                        console.log('consumer0: Depth 0');
                        return value;
                    },
                consumer1 =
                    async () => {
                        const value = await last.promise;
                        console.log('consumer1: Depth 0');
                        return value;
                    };

            /**
             * In real use the event emitter would do this and it would
             * specify the order in doing so. For example reading a file
             * it would emit the reads (blocks) in order.
             * You'll note however that if you swap these here then the
             * order will be lost on race.
             * setTimeout ensures we're not accidentally giving Promise.race
             * two resolved promises up front, which it won't know how to order.
             * What's crucial here is that the only difference between these two
             * is call stack size. You'll observe that being what determines the order.
             */
            setTimeout(() => {
                const producers = order(order0, [
                    () => {console.log('producer0: 2');first.resolve(2)},
                    () => {console.log('producer1: 3');last.resolve(3)}
                ]);
                /**
                 * This causes the events to execute in order. That'll
                 * presumably invoke the behaviour of putting the next
                 * event on the event queue rather than the promise queue.
                 * The promise queue is drained after each event,
                 * so it would run in order.
                 */
                //producers.forEach(producer => setTimeout(producer, 0));
                producers.forEach(producer => producer());
            }, 0);

            var value = 0,
                /**
                 * Order here shouldn't matter.
                 * Having it fixed at two doesn't represent real use.
                 * In reality first might be an iterator instead with
                 * no fixed length.
                 * If the emitter wraps click events for example,
                 * there's no exact number. This is normal for a
                 * stream.
                 * It could be zero, one or more. We never know
                 * which is the last.
                 */
                promises = order(order1, [consumer0(), consumer1()]),
                pending = [...promises];

            while(pending.length) {
                /**
                 * This is the equivalent to select(), taking arbitrary
                 * events and demultiplexing them into
                 * single one by one results.
                 * As a generic function it doesn't need to
                 * know about the callstack behind the promise.
                 */
                console.log(`Waiting for: ${pending.length}`);
                const [index, result] = await Promise.race(pending);
                pending = pending.filter(promise => promises[order1[index]] !== promise);
                console.log(`Result: ${result}`);
                value = Math.sqrt(value + result);
            }

            console.log(`Value: ${value}`);
        }
})();
```

https://www.ecma-international.org/ecma-262/6.0/#sec-promise-objects

It looks like the design defect is here in the specification (not checking for context):

25.4.1.8 TriggerPromiseReactions ( reactions, argument ): The abstract operation TriggerPromiseReactions takes a collection of PromiseReactionRecords and enqueues a new Job for each record. Each such Job processes the [[Handler]] of the PromiseReactionRecord, and if the [[Handler]] is a function calls it passing the given argument.

1. Repeat for each reaction in reactions, in original insertion order:
   a. Perform EnqueueJob("PromiseJobs", PromiseReactionJob, «reaction, argument»).
2. Return undefined.

Enqueue is push. Unshifting onto the queue instead would probably fix order while maintaining that the callback runs after the current tick (though it would delay interleaving; you could trigger a push with next tick instead, giving you control over granularity). It makes sense that no one sees the problem when everyone gets everything the wrong way around, and the problem is that it's the wrong way around.
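
A toy sketch (my own, not spec machinery) of the push versus unshift difference being described:

```js
// Toy job queue. The spec's EnqueueJob amounts to push: FIFO,
// breadth-first across promise chains. Swapping push for unshift
// makes draining depth-first down a single chain instead.
const jobs = [];
const enqueue = job => jobs.push(job);        // spec behaviour
// const enqueue = job => jobs.unshift(job);  // the proposed variant

function drain() {
  while (jobs.length) jobs.shift()();
}
```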

devsnek commented 4 years ago

if you're curious about changes to job ordering, I have a spec compliant implementation of JavaScript here you can play with: https://github.com/engine262/engine262/blob/663619929cf1e70faa3263e29f4a5d783f009d73/src/engine.mjs#L149

also edition 6 is a few years old at this point, I recommend referencing https://tc39.es/ecma262

brodycj commented 4 years ago

https://www.ecma-international.org/ecma-262/6.0/#sec-promise-objects

Don't you mean this link: https://www.ecma-international.org/ecma-262/6.0/#sec-triggerpromisereactions

It looks like the design defect is here in the specification (not checking for context):

What do you mean by "not checking for context"?

joeyhub commented 4 years ago

By context I mean I haven't studied the entire spec for where that's invoked. It's really long. The way they write it out ends up long-winded, but you'll probably find EnqueueJob all over the place (any callback invocation, I would assume).

brodycj commented 4 years ago

I don't understand the code 100%, but I think your point is that JavaScript promises are not always executed or resolved in a very intuitive order or manner.

I think the purpose of JavaScript promises or ECMAScript promises is to provide deterministic chaining of code that has to wait for asynchronous results, not intuitive event streaming. There are other libraries available for event streaming.

Are there any other points that you would like to make?

joeyhub commented 4 years ago

If the point of promises plus async/await is not to replace callbacks, then you have to stick with callbacks or a mixture of both. They can't replace callbacks because they function differently and aren't able to do things callbacks can.

I doubt there are libraries that solve this particular problem while working with async/await, and if there are, they would require you to maintain the order state all the way up the call chain. If you look at how I apply partial, that has to be done for all root events (actually the ones at the leaves of the call tree). Imagine then filtering; you'd have to offset those ids, etc.

A Promise is an object that is used as a placeholder for the eventual results of a deferred (and possibly asynchronous) computation.

Then suddenly it jumps to defining a promise not as a placeholder for the results of a deferred and possibly asynchronous computation, but as something itself invoking asynchronous computation.

I'd just see what happens if you add append = true to export function EnqueueJob, have it unshift if not append, have PerformPromiseThen set it to false when calling, find whatever flushes PromiseFulfillReactions and do the same, and see what happens.

It's not a case of it being counter-intuitive. It's a case of it losing information that it doesn't have to, by doing something it doesn't have to (no functionality is lost if that feature is removed), forcing you to effectively implement your own call stack to be able to rectify it.

I believe golang has select; I'd take a look at whether that loses event order or not.

When a task that resolves a promise is put on the IO queue, I think that when it runs, any resolves it calls need to go on another queue; once that queue is drained, it should return to the normal queue. What I'm seeing, though, is IO events going into the promise queue while it's still not drained, or rather there appears to be a single queue. This might result from some confusion in trying to use promises and one queue for everything.

There's inconsistency between what they're saying and the actual outcome on both fronts. IO events are intruding on the run-to-completion of the current promise jobs, and promises aren't protected from each other, due to FIFO (breadth-first) ordering or to not having their own sub-queues. It needs to be depth-first to preserve order; that needs a stack, not a queue.

DE: this is what makes promises run after synchronous code in a function. In HTML & Node, Promise jobs run before the rest of the "event loop".

DE: In HTML and Node, the algorithm (simplified) is: run the JS code, run the Promise jobs, return to the event loop (which may call back into javascript, eg when network connection comes maybe call some event in js).

MM: Trying to understand wrt what's being said about weakrefs as an example. From what was presented early, would have assumed that all promises run to completion first, then all I/O tasks happen when there are no queued promise things.

AVK: What engines intend to implement is promise queue is drained, then weakrefs happen, then (missed).

There seems to be some confusion on this point.

SG: Main miss for stage 2 that (missed) identified last time. Semantics for waitAsync that promises are resolved in FIFO order, except if agent is already blocked. Motivation is if agent is blocked then resolving does nothing because agent is blocked.

The problem may be that they have only thought about this from a scheduling perspective, without considering the impact on maintaining correct execution order in user space. Scheduling differently from the standard of following the call stack to run to completion should be an option exposed to the user, rather than imposed on the user.

Working around it can involve all kinds of hacks. For example, you might choose to buffer for longer than a call stack is likely to be, with something that wastes CPU and adds latency by ignoring events until enough ticks have passed to be sure everything before then has resolved, then transmitting ids up from every original event source to permit reordering. With this, JS quickly loses the latency benefits of being async. You'll get far superior results returning to callbacks.

If I implement a "fixed" Promise.race with, say, a 100-tick buffer (configurable), people are going to look at that and say: just what the hell am I on? JavaScript, mate. What happens when I have to do that recursively?

Also a correction: async/await does not run in the next tick. It runs in a tick at some arbitrary point later.

joeyhub commented 4 years ago

A partially ironic solution to this is to call await timeout(0) to split potentially multiple events calling promise resolves from a single IO event source (I assume this is what was happening), forcing the promise chain out of the promise queue and into the IO queue. For multiple listeners on the same forking root async function, the order still appears to be determined by stack depth, so internal events could still potentially get jumbled up later. If I put it everywhere naively, I'd just be transplanting from one queue to another. Normally it should be the other way around, with await timeout(0) wherever you want to defer (break up) execution and allow other things in.
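
A sketch of the timeout helper assumed above (the name is mine):

```js
// Resolves after ms milliseconds via the timer (macrotask) queue, so
// `await timeout(0)` defers the continuation past pending timer/IO events
// rather than merely to the end of the promise microtask queue.
const timeout = ms => new Promise(resolve => setTimeout(resolve, ms));
```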