Closed panktiszluck closed 2 years ago
Hello. Resumable uploads are used for large files, as far as I understand, but unfortunately my package doesn't implement this functionality yet. I'll try to add it, but I can't promise a time frame, sorry.
A PR is welcome.
As for showing upload/download progress, I guess it's always on the client side. Good news is, resumable uploads provide the necessary info for it, as far as I remember.
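As a rough sketch of how that client-side progress idea could work (plain JavaScript; the helper name and inputs are illustrative assumptions, not this package's API), the uploaded-byte count a status check reports, divided by the total size, gives a progress fraction:

```javascript
// Illustrative helper: turn the uploaded-bytes count a resumable-upload
// status check reports into a progress fraction in [0, 1].
// (Names here are assumptions for the sketch, not this package's API.)
function uploadProgress(uploadedBytes, totalBytes) {
  if (totalBytes <= 0) return 0; // avoid division by zero
  return Math.min(uploadedBytes / totalBytes, 1);
}

console.log(uploadProgress(5 * 1024 * 1024, 20 * 1024 * 1024)); // 0.25
```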
Thanks for your response; I'll try to find a workaround.
Hi, did you get a parse error while uploading a file?
Please give some more details. You might want to check out my example project; the link is in the readme of this package.
Hi @RobinBobin, how can we use the new method ResumableUploader?
Hi, it's a class, not a method.
I'll try to find some time to add relevant code to my sample project.
In the meantime, please consult the readme and say what seems unclear.
Thanks.
I am not able to understand how to use it to upload large files.
I updated the sample project, adding 2 buttons (Resumable upload (multi) / Resumable upload (single)) with relevant code snippets.
Great, thanks! I will check and get back.
Hi, I tried with the code below, but I'm getting an error for the last chunk.
File to upload: https://research.nhm.org/pdfs/10840/10840-001.pdf
Code
const maxSize = 5 * 1024 * 1024; // 256 * 1024;
var stats = await RNFetchBlob.fs.stat(localFilePath);
// console.log('stats', stats);
const fileSize = stats.size;
var chunkSize = maxSize;
var chunks = Math.ceil(fileSize / chunkSize);
var chunk = 0;
// eslint-disable-next-line no-restricted-syntax
console.log('file size..', fileSize);
// eslint-disable-next-line no-restricted-syntax
console.log('chunks...', chunks);
var fileName = filename + '.' + extension;

const uploader = await _gdrive.files
  .newResumableUploader()
  .setDataType(MimeTypes.PDF)
  .setShouldUseMultipleRequests(true)
  .setRequestBody({
    name: fileName,
    parents: [folderId]
  })
  .execute();

while (chunk <= chunks) {
  var offset = chunk * chunkSize;
  // eslint-disable-next-line no-restricted-syntax
  console.log('current chunk..', chunk);
  // eslint-disable-next-line no-restricted-syntax
  console.log('offset...', chunk * chunkSize);
  // eslint-disable-next-line no-restricted-syntax
  console.log(
    'file blob from offset...',
    Math.min(offset + chunkSize, fileSize)
  );

  var isLast = false;
  var _file = await RNFS.read(
    localFilePath,
    chunkSize,
    isLast ? Math.min(offset + chunkSize, fileSize) : offset,
    'base64'
  );

  // eslint-disable-next-line no-restricted-syntax
  console.log('upload chunk', chunk, await uploader.uploadChunk(_file));
  uploader.setContentLength(_file.length);
  // eslint-disable-next-line no-restricted-syntax
  console.log('upload status', await uploader.requestUploadStatus());
  chunk++;
}
Error
{"__response":{"type":"default","status":400,"ok":false,"statusText":"","headers":{"map":{"alt-svc":"h3=\":443\"; ma=2592000,h3-29=\":443\"; ma=2592000,h3-Q050=\":443\"; ma=2592000,h3-Q046=\":443\"; ma=2592000,h3-Q043=\":443\"; ma=2592000,quic=\":443\"; ma=2592000; v=\"46,43\"","content-length":"37","content-type":"text/plain; charset=utf-8","server":"UploadServer","date":"Mon, 05 Sep 2022 11:11:17 GMT","x-guploader-uploadid":"ADPycdtcRfAIb_3-9SpIt_2WRTZYWImgZrBXy80_3Zl7w3OV9fFc_hPAYU9LULr3kXvT803skcMivbjlBi2ihvz7nNunvqQT-Fyj"}},"url":"https://www.googleapis.com/upload/drive/v3/files?uploadType=resumable&upload_id=ADPycdtcRfAIb_3-9SpIt_2WRTZYWImgZrBXy80_3Zl7w3OV9fFc_hPAYU9LULr3kXvT803skcMivbjlBi2ihvz7nNunvqQT-Fyj","bodyUsed":true,"_bodyInit":{"_data":{"size":37,"offset":0,"blobId":"B8E7F734-F3B3-4D3A-A9CF-F0BDEB2BE970","type":"text/plain","name":"files.txt","__collector":{}}},"_bodyBlob":{"_data":{"size":37,"offset":0,"blobId":"B8E7F734-F3B3-4D3A-A9CF-F0BDEB2BE970","type":"text/plain","name":"files.txt","__collector":{}}}},"message":"Failed to parse Content-Range header.","__text":"Failed to parse Content-Range header."}
Am I missing something?
Create the uploader outside the loop. Invoke setContentLength() outside the loop.
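A minimal sketch of the chunking arithmetic this advice implies (plain JavaScript with no React Native calls, so the helper is illustrative only): the uploader would be created once before the loop, and each iteration then reads `Math.min(chunkSize, fileSize - offset)` bytes starting at `offset`:

```javascript
// Sketch: the (offset, length) pairs a chunked upload should send.
// The uploader itself is created once, outside this loop; here we
// only model the byte ranges each iteration covers.
function chunkRanges(fileSize, chunkSize) {
  const ranges = [];
  for (let offset = 0; offset < fileSize; offset += chunkSize) {
    ranges.push({ offset, length: Math.min(chunkSize, fileSize - offset) });
  }
  return ranges;
}

// A 10 MiB file in 4 MiB chunks: chunks of 4 MiB, 4 MiB and 2 MiB,
// at offsets 0, 4194304 and 8388608.
console.log(chunkRanges(10 * 1024 * 1024, 4 * 1024 * 1024));
```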
And try pasting the relevant code from the sample project into your project. When you get it working you'll have an idea of how things should work. It will be easier for you to add your own code then.
Okay, let me try. I had just figured out to create the uploader outside the loop and tried it. It created the file, but the content is in base-64.
Great! Well, I guess you're sending it as base-64...
Yes, any idea how I can send the actual data, as I want to upload a PDF file? Also, when I placed setContentLength() outside the loop I got the following error:
[HttpError: Invalid request. According to the Content-Range header, the upload offset is 20447232 byte(s), which exceeds already uploaded size of 13631488 byte(s).]
> Any idea how I can send the actual data, as I want to upload a PDF file?
Convert base64 to binary on your side and then upload binary.
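A small sketch of that conversion (shown with Node's `Buffer` for illustration; in React Native an equivalent base64 decoder would be needed): decode the base64 string that `RNFS.read` returns into raw bytes before uploading, so the stored file is the actual PDF rather than base64 text.

```javascript
// Sketch: decode a base64 string into raw bytes before uploading.
// Node's Buffer is used here for illustration; React Native would
// need an equivalent decoder.
function base64ToBytes(b64) {
  return Uint8Array.from(Buffer.from(b64, 'base64'));
}

const bytes = base64ToBytes('JVBERi0='); // base64 of the "%PDF-" magic header
console.log(String.fromCharCode(...bytes)); // "%PDF-"
```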
> When I placed setContentLength() outside the loop I got the following error: [HttpError: Invalid request. According to the Content-Range header, the upload offset is 20447232 byte(s), which exceeds already uploaded size of 13631488 byte(s).]
If you know the file size beforehand, invoke setContentLength() before uploading any data.
If you don't know the size beforehand, upload all the data and invoke setContentLength() after that.
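For context, the Content-Range header the error messages refer to encodes exactly this choice: a known total size appears after the slash, while an unknown one is written as `*` until the end. A sketch of how such a header value is formed (illustrative only, not this package's internals):

```javascript
// Sketch: the Content-Range header value a resumable-upload chunk
// carries. Pass null for totalSize while the final size is unknown;
// the protocol then uses "*" in its place.
function contentRange(offset, chunkLength, totalSize) {
  const end = offset + chunkLength - 1; // byte ranges are inclusive
  const total = totalSize == null ? '*' : totalSize;
  return `bytes ${offset}-${end}/${total}`;
}

console.log(contentRange(0, 5 * 1024 * 1024, 20447232));           // "bytes 0-5242879/20447232"
console.log(contentRange(5 * 1024 * 1024, 5 * 1024 * 1024, null)); // "bytes 5242880-10485759/*"
```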
I hope it'll work.
Hi, I'm getting the same error even after calling setContentLength() before/after the content upload.
Can you upload random bytes, without reading your file, as my sample project does?
Okay, let me try.
Yes, that works.
Great. Can you use the same logic with your actual data?
I tried, but I'm getting the above error.
Please paste your current relevant code, but without comments and console.log.
Hi, here is the exact code I am trying,
var chunkSize = 5 * 1024 * 1024;

const uploader = await _gdrive.files
  .newResumableUploader()
  .setDataType(MimeTypes.PDF)
  .setShouldUseMultipleRequests(true)
  .setRequestBody({
    name: fileName,
    parents: [folderId]
  })
  .execute();

uploader.setContentLength(fileSize);

while (chunk <= chunks) {
  var offset = chunk * chunkSize;
  var isLast = false;

  if (chunk === chunks) {
    isLast = true;
  }

  var _file = await RNFS.read(
    localFilePath,
    chunkSize,
    isLast ? Math.min(offset + chunkSize, fileSize) : offset,
    'base64'
  );

  await uploader.uploadChunk(_file);
  await uploader.requestUploadStatus();
  chunk++;
}
Didn't I ask for it WITHOUT comments and console.log? They distract attention. Please edit.
Also, please write which line throws, and paste the exception message.
I have updated the code snippet. I'm not sure which line throws the exception, but I get the message below when uploading the 2nd/3rd chunk in the loop.
{"__response":{"type":"default","status":503,"ok":false,"statusText":"","headers":{"map":{"alt-svc":"h3=\":443\"; ma=2592000,h3-29=\":443\"; ma=2592000,h3-Q050=\":443\"; ma=2592000,h3-Q046=\":443\"; ma=2592000,h3-Q043=\":443\"; ma=2592000,quic=\":443\"; ma=2592000; v=\"46,43\"","content-length":"148","content-type":"text/plain; charset=utf-8","server":"UploadServer","date":"Tue, 06 Sep 2022 03:57:25 GMT","x-guploader-uploadid":"ADPycdsO0OuopFB2dIGpunp3rCTcW4ranEll4_lVKHuSh_8q-K6ApNCholxfr5CJPLcEbL2wE8FyOw0NKgm31FY32AK0Or-_mv2U"}},"url":"https://www.googleapis.com/upload/drive/v3/files?uploadType=resumable&upload_id=ADPycdsO0OuopFB2dIGpunp3rCTcW4ranEll4_lVKHuSh_8q-K6ApNCholxfr5CJPLcEbL2wE8FyOw0NKgm31FY32AK0Or-_mv2U","bodyUsed":true,"_bodyInit":{"_data":{"size":148,"offset":0,"blobId":"28CF8A5F-DD2D-4FC7-A55A-BD1573CB5F4C","type":"text/plain","name":"files.txt","__collector":{}}},"_bodyBlob":{"_data":{"size":148,"offset":0,"blobId":"28CF8A5F-DD2D-4FC7-A55A-BD1573CB5F4C","type":"text/plain","name":"files.txt","__collector":{}}}},"message":"Invalid request. According to the Content-Range header, the upload offset is 786432 byte(s), which exceeds already uploaded size of 524288 byte(s).","__text":"Invalid request. According to the Content-Range header, the upload offset is 786432 byte(s), which exceeds already uploaded size of 524288 byte(s)."}
My assumption is there's a mess in the while-condition. Don't you have one extra iteration?

> while (chunk <= chunks) { var offset = chunk * chunkSize;

You need chunk to start from zero, but you have <= in the while-condition.
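The extra iteration is easy to demonstrate in plain JavaScript (an illustrative helper, not from the package): with `chunks = Math.ceil(fileSize / chunkSize)` and `chunk` starting at zero, `chunk <= chunks` produces one read offset past the end of the file, while `chunk < chunks` covers it exactly.

```javascript
// Sketch: compare the read offsets produced by `<=` vs `<` in the
// while-condition, when chunk starts at 0 and chunks = ceil(size/chunkSize).
function iterationOffsets(fileSize, chunkSize, inclusive) {
  const chunks = Math.ceil(fileSize / chunkSize);
  const offsets = [];
  for (let chunk = 0; inclusive ? chunk <= chunks : chunk < chunks; chunk++) {
    offsets.push(chunk * chunkSize);
  }
  return offsets;
}

// 3 MiB file, 1 MiB chunks:
console.log(iterationOffsets(3145728, 1048576, true).length);  // 4 (last offset is past EOF)
console.log(iterationOffsets(3145728, 1048576, false).length); // 3
```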
Anyway, I can't see anything wrong in the way you're using my package. You can try removing my package relevant code, keep RNFS-logic and console.log offsets you get for each chunk. Comparing this to the file size you might get a hint.
Hi, maybe, but I tried removing the = from that condition as well and got the same result. I'll look into it further.
Hi, how can I upload large files to Google Drive using this package? And is there any way to show upload/download progress?
Thanks