theazgra opened 2 months ago
@cjpillsbury I will tag you directly, hope that is ok.
After looking at it more, the problem seems to be the `nextChunkRangeStart` property, which is not reset correctly when the fallback is applied. `nextChunkRangeStart` is updated in `successfulChunkUploadCb`; then an error occurs, `readableStreamErrorCallback` is called, and `nextChunkRangeStart` is not reset to 0. This leaves `nextChunkRangeStart` different between the `ChunkedFileIterable` instance and the `UpChunk` instance.

Resetting `nextChunkRangeStart` in `readableStreamErrorCallback` seems to fix the problem:
```ts
if (options.useLargeFileWorkaround) {
  const readableStreamErrorCallback = (event: CustomEvent) => {
    // In this case, assume the error is a result of file reading via ReadableStream.
    // Retry using ChunkedFileIterable, which reads the file into memory instead
    // of a stream.
    if (this.chunkedIterable.error) {
      this.nextChunkRangeStart = 0; // <-- reset nextChunkRangeStart
      console.warn(
        `Unable to read file of size ${this.file.size} bytes via a ReadableStream. Falling back to in-memory FileReader!`
      );
      event.stopImmediatePropagation();
      // Re-set everything up with the fallback iterable and corresponding
      // iterator
      this.chunkedIterable = new ChunkedFileIterable(this.file, {
        ...options,
        defaultChunkSize: options.chunkSize,
      });
      this.chunkedIterator = this.chunkedIterable[Symbol.asyncIterator]();
      this.getEndpoint().then(() => {
        this.sendChunks();
      });
      this.off('error', readableStreamErrorCallback);
    }
  };
  this.on('error', readableStreamErrorCallback);
}
```
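For context, the fallback path above is only exercised when the upload is created with `useLargeFileWorkaround` enabled. A minimal usage sketch (the endpoint URL and file selection are placeholders, and the explicit `chunkSize` value is my assumption about the default):

```ts
import * as UpChunk from '@mux/upchunk';

// Placeholder file selection; in this report it is a ~1.5 GB file picked in Safari.
const input = document.querySelector('input[type="file"]') as HTMLInputElement;
const file = input.files![0];

const upload = UpChunk.createUpload({
  endpoint: 'https://example.com/upload-url', // placeholder direct-upload URL
  file,
  chunkSize: 30720, // KB; stated explicitly here, assumed to match the default
  useLargeFileWorkaround: true, // enables the ReadableStream -> FileReader fallback above
});

upload.on('error', (err) => console.error('upload error', err.detail));
upload.on('success', () => console.log('upload complete'));
```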
Hey @theazgra sorry for the delayed response! That looks right to me and all makes sense. Feel free to issue a pull request if you wanted to contribute!
Hi,
this could be considered a follow-up to #134, which was resolved by #138.
We have discovered a new bug affecting the same users, i.e. Safari users. We detected the error while uploading a file of roughly 1.5 GB.
The implemented fallback is used, as the "Falling back to in-memory FileReader!" warning is logged to the console.
The upload then continues but fails on the last chunk, because the Content-Range header is invalid: the upper limit of the range is larger than the file size.
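To illustrate the failure mode, here is a hedged, hypothetical sketch (not UpChunk's actual code; `contentRange`, `start`, `chunkByteLength`, and `fileSize` are illustrative names standing in for `nextChunkRangeStart`, the current chunk's byte length, and `file.size`):

```ts
// Content-Range byte offsets are inclusive, so the final chunk of a file must end
// at exactly fileSize - 1.
function contentRange(start: number, chunkByteLength: number, fileSize: number): string {
  const end = start + chunkByteLength - 1;
  return `bytes ${start}-${end}/${fileSize}`;
}

// e.g. contentRange(0, 31457280, 1503686620) === 'bytes 0-31457279/1503686620'

// If `start` (UpChunk's nextChunkRangeStart) has drifted ahead of the position the
// fallback iterable is actually reading from, the end byte computed for the last
// chunk lands past fileSize - 1 and the server rejects the request.
```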
Headers of the last chunk upload:
Note that 1509949439 > 1503686620.
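For what it's worth, a hedged breakdown of those numbers (assuming the default 30720 KB chunk size, i.e. 31,457,280 bytes per chunk): 1509949439 = 48 × 31,457,280 − 1, so the reported range ends on a whole-chunk boundary, while the last valid byte of a 1,503,686,620-byte file is 1503686619. The final Content-Range therefore overshoots the end of the file by 6,262,820 bytes, which is consistent with the stale `nextChunkRangeStart` described above.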
HTTP error from Google API:
The error handler is correctly invoked, at least :)
Interestingly, uploading the same file a second time fixes the problem (using the same UpChunk instance; the page is not reloaded).