aswinshenoy opened this issue 4 years ago
One possible cause is #14: hitting the memory limit due to a flawed chunking algorithm.
The chunking algorithm on the sender's side seems to have been fixed (see #14 for the stress test done by @MidhunSureshR) by introducing a new algorithm that produces and stores only the next chunk on demand, instead of creating and storing every chunk of the file up front.
New algorithm:

```js
// Lazily slice only the next chunk out of the file
const slice = this._file.slice(
    this._offset,
    this._offset + this._chunkSize
);
// File.slice() already returns a Blob, so it can be read directly;
// wrapping it in new Blob([slice]) and chaining .then((buffer) => buffer) was redundant
return await slice.arrayBuffer();
```
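For illustration, here is a minimal sketch of how a sender loop might drive this lazy chunking, advancing `this._offset` after each chunk. The class and the `sendChunk` transport callback are assumptions for illustration, not the actual BayJDO implementation:

```js
// Minimal sketch; names other than _file, _offset and _chunkSize
// are hypothetical.
class ChunkedSender {
    constructor(file, chunkSize = 6 * 1024 * 1024) {
        this._file = file;
        this._chunkSize = chunkSize;
        this._offset = 0;
    }

    // Produce only the next chunk, mirroring the fixed algorithm above
    async _getNextChunk() {
        const slice = this._file.slice(
            this._offset,
            this._offset + this._chunkSize
        );
        return await slice.arrayBuffer();
    }

    // Drive the transfer one chunk at a time; sendChunk is a placeholder
    // for whatever transport (e.g. a WebRTC data channel) is used
    async sendAll(sendChunk) {
        while (this._offset < this._file.size) {
            const chunk = await this._getNextChunk();
            await sendChunk(chunk);
            this._offset += this._chunkSize;
        }
    }
}
```

With this pattern, at most one chunk's worth of data is resident in memory at a time on the sender's side.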
The following script is now obsolete and should be removed. https://github.com/aswinshenoy/BayJDO/blob/master/functions/getChunksFromFile.js
```js
export default async function getChunksFromFile(file, chunkSize = 6 * 1024 * 1024) {
    if (file) {
        // file.arrayBuffer() reads the WHOLE file into memory up front,
        // and the returned array holds a copy of every chunk, so peak
        // memory usage is on the order of the full file size or more;
        // this is the flaw behind #14
        return await file.arrayBuffer().then((buffer) => {
            let chunks = [];
            while (buffer.byteLength) {
                chunks.push(buffer.slice(0, chunkSize));
                buffer = buffer.slice(chunkSize, buffer.byteLength);
            }
            return chunks;
        });
    }
    return [];
}
```
There is now another critical issue on the receiver's client, pointed out by @MidhunSureshR, that causes trouble when sending very large files; a new issue has been opened for it: #18.
#19 may also be a problem when sending large files: chunking takes a while and fails when trying to send very large files.
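For context on the receiver-side failure mode, one memory-friendlier pattern is to collect incoming chunks as Blob parts and assemble the file once at the end, rather than concatenating ArrayBuffers as chunks arrive. This is a sketch under assumptions (chunk delivery and completion signalling are hypothetical), not how #18 was or should be resolved:

```js
// Sketch only; onChunkReceived / onTransferComplete are assumed hooks.
const receivedParts = [];

function onChunkReceived(arrayBuffer) {
    // Store the chunk as-is; no per-chunk concatenation or copying
    receivedParts.push(arrayBuffer);
}

function onTransferComplete(fileName, mimeType) {
    // Build the Blob once; repeatedly concatenating ArrayBuffers
    // instead would copy all prior data on every chunk
    const file = new Blob(receivedParts, { type: mimeType });
    const url = URL.createObjectURL(file);
    // Trigger a download via a temporary anchor element
    const a = document.createElement('a');
    a.href = url;
    a.download = fileName;
    a.click();
    URL.revokeObjectURL(url);
}
```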