jimmywarting opened this issue 3 years ago
It made it easy to turn a stream into a blob/file without copying, while requiring it to be immutable: the data is appended to a file instead, and once that is done a blob is created.

I just thought of a way to make the data transferable without having to copy it in order to create a blob.
```js
const uint8 = new Uint8Array([97]) // will get detached and no longer be usable
// read these before transfer() detaches the view (a detached view reports 0 for both)
const { byteOffset, byteLength } = uint8
const clone = new Uint8Array(uint8.buffer.transfer(), byteOffset, byteLength)
// Olé! We have an immutable uint8 array that we can make a blob out of,
// and they can't mutate the data any longer.
```
It's a way safer option than letting people mutate the data, which they're likely not going to do anyway. I would like to have a transfer option, much like postMessage does.
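A minimal runnable sketch of the detaching trick, under one assumption: `buffer.transfer()` is ES2024, so this uses `structuredClone` with a transfer list as an older, widely available way to get the same "move, then detach" behavior.

```js
// move the bytes into a fresh buffer, detaching the original view
const source = new Uint8Array([97, 98, 99])
const moved = structuredClone(source.buffer, { transfer: [source.buffer] })
const safe = new Uint8Array(moved)

console.log(source.byteLength) // 0 — the source view is detached
console.log(new Blob([safe]).size) // 3 — the bytes live on in the moved buffer
```

Once the source view is detached, no one holding a reference to it can touch the bytes the blob was built from.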
```js
// alternative 1: a third transfer-list argument
new Blob([ uint8 ], { ...opts }, [ uint8.buffer ])

// alternative 2: a `transfer` option in the options bag
new Blob([ uint8 ], {
  ...opts,
  transfer: [ uint8.buffer ]
})
```
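For illustration, the proposed `transfer` option could be emulated in userland today. This is a sketch under stated assumptions: `blobWithTransfer` is a hypothetical name, not an existing API, and it uses `structuredClone` transfer lists to detach each listed buffer before constructing the Blob.

```js
// Hypothetical helper: build a Blob while detaching the listed buffers,
// so callers keep no mutable handle to the bytes.
function blobWithTransfer (parts, { transfer = [], ...opts } = {}) {
  // record each view's offset/length BEFORE its buffer gets detached
  const views = parts.map(part =>
    ArrayBuffer.isView(part) && transfer.includes(part.buffer)
      ? { buffer: part.buffer, offset: part.byteOffset, length: part.byteLength }
      : null
  )
  // detach every buffer in the transfer list, keeping the moved copies
  const moved = new Map(
    transfer.map(buf => [buf, structuredClone(buf, { transfer: [buf] })])
  )
  // swap views over transferred buffers for views over the moved copies
  const safeParts = parts.map((part, i) =>
    views[i]
      ? new Uint8Array(moved.get(views[i].buffer), views[i].offset, views[i].length)
      : part
  )
  return new Blob(safeParts, opts)
}
```

After the call, the original views are detached, so the "they can't mutate the data any longer" guarantee holds without copying the bytes.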
In this video https://youtu.be/6EDaayYnw6M?t=1202 he talks about returning a blob from the fetch API. In theory you can return a blob early if you know the content-length (the size of the blob); the content does not have to be known immediately. You could, for example, make a request for a 4 GB file and have the blob returned right after you get the HTTP response, without having all the data at hand. That is to say: the response has a content-length and isn't compressed.
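The "early blob" idea boils down to: the size can be known as soon as the response headers arrive, before the body has been read. A small sketch, assuming the standard Fetch `Response` class (global in modern browsers and Node); `earlyBlobSize` is an illustrative name, not part of fetch-blob.

```js
// Return the would-be blob size from headers alone, or null if it can't
// be trusted (no content-length, or the body is compressed on the wire).
function earlyBlobSize (response) {
  const len = response.headers.get('content-length')
  const encoding = response.headers.get('content-encoding')
  if (len === null || (encoding && encoding !== 'identity')) return null
  return Number(len)
}

// with a real request: const size = earlyBlobSize(await fetch(url))
```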
This idea was brought up long before in Node.js by jasnell, about 4 years ago.

I'm still interested in this idea too, but I have no idea/clue how to sketch this up or how best to implement it.
I mean, I built this HTTP File-like class that operates on byte ranges and partial requests, with a known content-length. The goal of it all was to take a zip from a remote source and pass it to a zip parser that could slice and read the central directory, so it could retrieve a list of all the files that were included, then jump/seek within the blob for the stuff you needed without having to download the whole zip file. This meant it would make multiple partial HTTP requests for each file later on. It's a pretty cool concept of optimizing.
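A rough sketch of that remote, File-like concept, not the actual fetch-blob implementation: `HttpBlob` and `rangeHeader` are illustrative names. Slicing is pure bookkeeping; bytes only travel when a slice is actually read, via an HTTP Range request.

```js
// HTTP ranges are inclusive on both ends, Blob.slice ends are exclusive
function rangeHeader (start, end) {
  return `bytes=${start}-${end - 1}`
}

// A Blob-like backed by a URL: slice() just narrows the window,
// arrayBuffer() issues a partial request for exactly that window.
class HttpBlob {
  constructor (url, size, start = 0) {
    this.url = url
    this.size = size
    this.start = start
  }
  slice (start = 0, end = this.size) {
    return new HttpBlob(this.url, end - start, this.start + start)
  }
  async arrayBuffer () {
    const res = await fetch(this.url, {
      headers: { Range: rangeHeader(this.start, this.start + this.size) }
    })
    return res.arrayBuffer()
  }
}

// e.g. read only the last 22 bytes (a zip End Of Central Directory record):
// const eocd = await new HttpBlob(url, size).slice(size - 22, size).arrayBuffer()
```

This is how a zip parser can seek to the central directory without pulling the whole archive.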