Closed jimmywarting closed 5 years ago
I believe it would work something like this:
// chunkSize is optional
blobToIterator = function* (blob, chunkSize = 524288) {
  let position = 0
  while (true) {
    const chunk = blob.slice(position, position + chunkSize)
    // read the current chunk, not the whole blob
    const promise = util.blobToArrayBuffer(chunk).then(ab => new Uint8Array(ab))
    position += chunk.size
    // the final chunk comes back as the generator's return value
    if (position >= blob.size) return promise
    yield promise
  }
}
not tested
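If streams do get `Symbol.asyncIterator`, the same chunking loop reads more naturally as an async generator. A sketch, assuming the standard `Blob.prototype.arrayBuffer()` method in place of blob-util's `blobToArrayBuffer` helper:

```javascript
// Sketch: the same chunking idea as above, but as an async generator.
// Assumes Blob.prototype.arrayBuffer() (standard, but newer than
// FileReader) instead of util.blobToArrayBuffer.
async function * blobToAsyncIterator (blob, chunkSize = 524288) {
  let position = 0
  while (position < blob.size) {
    const chunk = blob.slice(position, position + chunkSize)
    position += chunk.size
    // each step resolves with one Uint8Array chunk
    yield new Uint8Array(await chunk.arrayBuffer())
  }
}
```

A consumer could then just write `for await (const chunk of blobToAsyncIterator(blob)) { … }`.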
My goal with blob-util is to keep things as simple as possible, but I would welcome you to write your own set of utilities to do streaming support. :)
Would be cool if this included something that helps you stream a blob. Converting the whole blob into something else is not optimal when the blob is large.
This is how you would get one in the easiest way possible: wrap the blob in a `Response` and take its body. And the way you would read that stream would be to use the reader from `stream.getReader()`.
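A minimal sketch of that `Response` + `getReader()` approach (assuming `Response` and `ReadableStream` are available in the environment):

```javascript
// Wrap the Blob in a Response to borrow its body ReadableStream,
// then pull chunks manually with a reader.
const blob = new Blob(['some example bytes'])
const stream = new Response(blob).body
const reader = stream.getReader()

function pump () {
  return reader.read().then(({ value, done }) => {
    if (done) return
    // value is a Uint8Array chunk
    console.log('chunk of', value.byteLength, 'bytes')
    return pump()
  })
}
pump()
```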
But there is a cooler way on the horizon: async iterators. This is thanks to streams getting a `Symbol.asyncIterator` soon.
I was just wondering if you could make something like the async iterator possible using only the FileReader api, without having to depend on `Response.prototype.body`, which isn't available in all browsers yet since the `ReadableStream` api is missing. So, something like this:
this could also be written as:
Each time you call `next()` you would have to return a promise resolving with a `Uint8Array`.
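Consuming such an iterator by hand might look like this (a sketch; the generator is the one proposed above, rewritten with the standard `Blob.prototype.arrayBuffer()` instead of blob-util's `blobToArrayBuffer` so it is self-contained):

```javascript
// Sketch of the "each next() returns a promise of a Uint8Array" contract.
function * blobToIterator (blob, chunkSize = 524288) {
  let position = 0
  while (true) {
    const chunk = blob.slice(position, position + chunkSize)
    position += chunk.size
    const promise = chunk.arrayBuffer().then(ab => new Uint8Array(ab))
    if (position >= blob.size) return promise
    yield promise
  }
}

const it = blobToIterator(new Blob(['hello world']), 4)
let step = it.next()
while (!step.done) {
  step.value.then(bytes => console.log(bytes.byteLength)) // one Uint8Array per chunk
  step = it.next()
}
// the final chunk arrives as the generator's return value (done: true)
step.value.then(bytes => console.log('last chunk:', bytes.byteLength))
```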