tc39 / proposal-async-iteration

Asynchronous iteration for JavaScript
https://tc39.github.io/proposal-async-iteration/
MIT License

Add some examples for practical usage? #135

Open qiulang opened 3 years ago

qiulang commented 3 years ago

Hi, I know MDN has examples of using for-await-of, but to me those examples are not compelling enough to convince me to use for-await-of. I just can't think of a scenario where I have an array of promises that I need to resolve one by one.

I get the feeling that maybe it is syntactic sugar, the kind without which the code would become unnecessarily complicated, but this time I just don't see that happening (yet).

I asked the same question at SO https://stackoverflow.com/questions/67103606/some-practical-example-of-asynchronous-iterationfor-await-of-in-action

But no one answered it and my question was even down-voted. So I'm opening an issue here hoping someone may answer my question.

Thanks!

yglbhdy commented 3 years ago

Hello, this is roughly something I worked on back in 2018, and it has been a long time since then, so I probably can't help you much anymore. The work I do now is front-end, so I may not be able to help you. You can look for other people involved with this for help, sorry!

Jamesernator commented 3 years ago

have an array of promises that I need to resolve one by one.

An array (or sync iterable) of promises is more of a bonus; the primary usage for async iterables and the for-await-of syntax is for things like streams.
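
Just to make that "bonus" case concrete, here's a minimal sketch (the URLs are made up): for-await-of over a plain array of promises awaits each one before running the loop body, so the results are processed strictly in order:

// A plain (sync) array of promises
const userRequests = [
    fetch("/users/1"),
    fetch("/users/2"),
    fetch("/users/3"),
];

// Each promise is awaited before the loop body runs, so the
// responses are handled in array order even if they resolve out of order
for await (const response of userRequests) {
    console.log(response.status);
}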

Currently there's not a lot of built-in stuff that actually exposes itself as an async iterable, so if you're not using async generators for anything you probably won't have hit any. But recently the streams spec allowed ReadableStream to be an async iterable, so when browsers ship that, it'll be (one of?) the first built-in objects that uses it other than async generators.

As an example of how you might use it with ReadableStream: if you fetched a bunch of lines representing messages or whatever, you could consume them in chunks and update things as they arrive, fairly cleanly compared to callbacks:

// Async generators can be used as async iterables too
async function* linesFromStream(stream) {
    const textDecoder = new TextDecoder("utf8");
    let partialLine = '';
    // Chunks come in asynchronously, so we use a for-await-of loop
    for await (const chunk of stream) {
        const text = textDecoder.decode(chunk, { stream: true });
        // A chunk boundary can land in the middle of a line, so prepend the
        // unfinished tail of the previous chunk before splitting into lines
        const lines = (partialLine + text).split("\n");
        // The last "line" might not be complete yet, so store it in case
        // the next chunk has more of this line
        partialLine = lines.pop();
        // Every other entry is a complete line and can be emitted directly
        for (const line of lines) {
            yield line;
        }
    }
    // If we're out of data, what is left is the last line
    yield partialLine;
}

const messageContainer = document.getElementById("messages");

// Fetch a message list
const response = await fetch("/messageList");
// We can use for-await-of for async generators too
// Note that unlike using response.text(), we actually add
// messages to the DOM as they arrive, rather than once all
// of them have arrived. In fact this stream doesn't actually have
// to ever finish; it'll keep adding messages as long as the server
// keeps sending them
for await (const line of linesFromStream(response.body)) {
    const message = document.createElement('div');
    message.textContent = line;
    messageContainer.append(message);
}

Jamesernator commented 3 years ago

Also, when combined with async generators, you can create some fairly efficient lazy structures (which are really easy to write once you know how). For example, in Node.js, suppose you wanted something that listed all files in every directory (recursively); you could do:

import fs from "fs/promises";
import path from "path";

async function* findAllFiles(directory: string) {
    const dirEntries = await fs.readdir(directory, { withFileTypes: true });
    for (const dirEntry of dirEntries) {
        if (dirEntry.isFile()) {
            yield path.join(directory, dirEntry.name);
        } else if (dirEntry.isDirectory()) {
            yield* findAllFiles(path.join(directory, dirEntry.name));
        }
    }
}

for await (const file of findAllFiles("/")) {
    if (file.match(/that-file-i-was-looking-for/)) {
        console.log(`Found it!`);
        console.log(file);
        break;
    }
}

What makes these particularly better than loading all the entries into an array and returning a Promise for the array is that the entries aren't actually all stored at once. If the list is extremely long (as can be the case for entire file systems), you only receive one item at a time, so the memory usage is merely proportional to the directory depth rather than the number of files.
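
For contrast, here's a rough sketch of what the eager, array-returning version would look like; it has to walk the whole tree and keep every path in memory before the caller sees a single one:

import fs from "fs/promises";
import path from "path";

// Eager version: recurses the entire tree up front and returns one big array
async function findAllFilesEager(directory) {
    const results = [];
    const dirEntries = await fs.readdir(directory, { withFileTypes: true });
    for (const dirEntry of dirEntries) {
        const fullPath = path.join(directory, dirEntry.name);
        if (dirEntry.isFile()) {
            results.push(fullPath);
        } else if (dirEntry.isDirectory()) {
            // Every nested array is built and concatenated before we can return
            results.push(...await findAllFilesEager(fullPath));
        }
    }
    return results;
}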

Another advantage of this approach is that you don't process what you don't need: when break is used in the above loop, the generator function actually just stops. It won't continue past the last file that was yield-ed, and as such we don't wind up searching the entire file system if we find what we need early on.
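
To make that concrete, here's a small sketch: breaking out of a for-await-of loop calls the generator's .return(), which runs any pending finally blocks and then stops the generator for good:

async function* numbers() {
    try {
        let n = 0;
        while (true) {
            yield n++;
        }
    } finally {
        // Runs when the consumer breaks out of the loop below
        console.log("generator cleaned up");
    }
}

for await (const n of numbers()) {
    if (n >= 3) {
        break; // triggers the finally block above, then the generator is done
    }
    console.log(n);
}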

Strictly speaking you could do all of this with more classical things like events, callbacks, manual polling, or queue systems, but for-await-of wraps the looping and cancellation into a nice little package so you don't have to worry about forgetting to stop the source when you're done.

qiulang commented 3 years ago

Thanks for replying to my question. The examples you showed here really should be added to https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Statements/for-await...of#examples

qiulang commented 3 years ago

@Jamesernator after thinking through your first example I still have some questions:

const response = await fetch("/messageList");
// Once we have run past this await the fetch request has resolved; in the current implementation
// I will get all the data in response.body. I need to wait to see an implementation
// that returns response.body as a stream to take advantage of your linesFromStream.
// Am I right?
for await (const line of linesFromStream(response.body)) {
...
}

And that is actually my question too: I have yet to see any implementation that does that. I actually asked another question at SO https://stackoverflow.com/questions/64021230/practical-example-of-nodejs-10-asynchronous-iteration-e-g-what-kind-of-problem , and no one answered that one either.

Sorry, this question has bugged me for a long time 🤯

Jamesernator commented 3 years ago

// Once we have run past this await the fetch request has resolved; in the current implementation // I will get all the data in response.body. I need to wait to see an implementation

Ah no, all browsers already implement ReadableStream, but the current interface for using it is to manually create a reader, e.g.:

const response = await fetch("/messageList");

const reader = response.body.getReader();

while (true) {
    const { done, value } = await reader.read();
    if (done) {
        break;
    }
    const chunk = value;
    // Do stuff with chunk here
}

This gets awkward quickly, because if you break out of the loop before done is true, the stream may keep buffering data until the whole response is received. It might be garbage collectable if you haven't kept a reference to it, but the exact semantics are very loose.

And even then, the garbage collection behaviour only applies to fetch streams specifically; other specifications might not bother with cleanup on garbage collection, and ReadableStream can also be created by users directly with new ReadableStream(myInit). If you don't explicitly call stream.cancel(), the stream will just keep running until it finishes (or forever, if it's an infinite stream).
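
So with the manual reader you end up having to remember the cleanup yourself, something along these lines (a sketch; shouldStopEarly is just a stand-in for whatever early-exit condition you have):

const response = await fetch("/messageList");
const reader = response.body.getReader();

try {
    while (true) {
        const { done, value } = await reader.read();
        if (done) {
            break;
        }
        if (shouldStopEarly(value)) { // hypothetical early-exit condition
            break;
        }
        // Do stuff with value here
    }
} finally {
    // Without this, an early exit can leave the stream buffering data
    await reader.cancel();
}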

Basically what for-of (and by extension for-await-of) does is get an iterator (or async iterator) from the object and repeatedly call .next() on it until it returns { done: true }. If the loop is broken with break, return, or throw, it calls the (somewhat confusingly named) .return function on the iterator.
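
Roughly, then, a loop like for await (const item of source) { ... } behaves something like this sketch (the real semantics have more error handling; stopEarly stands in for a break in the loop body):

const iterator = source[Symbol.asyncIterator]();
while (true) {
    const { done, value } = await iterator.next();
    if (done) {
        break;
    }
    const item = value;
    // ... loop body runs here with `item` ...
    if (stopEarly(item)) { // stands in for a `break` in the loop body
        // Early exit: close the iterator so the source can clean up
        if (iterator.return !== undefined) {
            await iterator.return();
        }
        break;
    }
}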

In the case of ReadableStream, they added a [Symbol.asyncIterator] method to the spec such that it behaves roughly like this:

ReadableStream.prototype[Symbol.asyncIterator] = function () {
    const reader = this.getReader();
    return {
        next() {
            // reader.read() already resolves to { done, value }
            return reader.read();
        },
        return() {
            // Called when the loop exits early; cancel the underlying stream
            return reader.cancel().then(() => ({ done: true, value: undefined }));
        },
    };
};

This is what allows it to be used in a for-await-of loop (and Symbol.iterator is what allows Array, Set, etc. to be used in a regular for-of loop). And yeah, other than a couple of extra bits, that's pretty much the entire thing they added to ReadableStream, although it still hasn't been implemented in any browser yet.
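
Once that method ships (or with a shim along the lines of the sketch above), the earlier manual reader loop reduces to iterating the stream directly:

const response = await fetch("/messageList");

// Iterate the body's chunks directly; breaking out of the loop
// calls the iterator's return(), which cancels the underlying stream
for await (const chunk of response.body) {
    // Do stuff with chunk here
}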