Closed: andresebr closed this issue 7 years ago.
The script itself has no limit on the number of files. However, the Google API has different limits for each of its products; IIRC, it was 10,000 when I created this. You can check the current limit for your project in the Google Developers interface.
On Wed, Feb 11, 2015, 6:34 AM Andrés Barreto notifications@github.com wrote:
First of all, I wanted to thank you for this awesome script; it works really well. I was wondering if there is a limit to the number of files that can be downloaded using this script. I have tested it with a small number of files and it works pretty well, but it doesn't download all files when I have a large number of them stored in the cloud. I'm not sure if this is an SDK limitation, though. Whether that's the case or not, is there a way to make this script work with a large number of files?
Thanks for your quick response. I've solved the problem.
After doing some research, I found out that the Drive API lists the files in a folder in pages (100 files per page by default), so it is necessary to read the 'nextPageToken' field from each response and send its value as the 'pageToken' parameter of the next request when listing the files inside a folder. The script does not currently support this, so no matter how many items are in a folder, it will only get the 100 items listed on the first page. This means the script downloads at most 100 files per folder unless this change is made.
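For example, the listing loop could look something like this. This is just a minimal sketch, assuming the script talks to the Drive API v2 through google-api-python-client; `list_all_children` and `service` are illustrative names, not the script's actual code:

```python
# Minimal pagination sketch for Drive API v2 (google-api-python-client).
# Assumes `service` is an already-authorized Drive service object built
# with apiclient.discovery.build('drive', 'v2', http=...).
def list_all_children(service, folder_id):
    items = []
    page_token = None
    while True:
        params = {'q': "'%s' in parents" % folder_id}
        if page_token:
            # Resume listing from where the previous page left off.
            params['pageToken'] = page_token
        response = service.files().list(**params).execute()
        items.extend(response.get('items', []))
        page_token = response.get('nextPageToken')
        if not page_token:
            # No nextPageToken in the response means this was the last page.
            break
    return items
```

Passing a larger 'maxResults' (up to 1000 per page in v2, if I remember the docs right) would cut down the number of round trips, but the 'nextPageToken' loop is still needed for folders of arbitrary size.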
Ah ok. Thanks! I'll try to fix the script to handle this. If you already have a solution, you're welcome to send a pull request.
Done!