Closed Anon-Exploiter closed 4 years ago
I looked into https://github.com/Akianonymus/gdrive-downloader/issues/7 before creating this; it seems he ran into the same problem as I did, but he didn't provide any details on whether that URL was publicly accessible or from his own drive.
Confirmed: this only happens when unauthenticated. I can reproduce it in an incognito session as well; if the user is logged in, it doesn't happen and the file downloads easily.
Also, I can confirm that when I'm authenticated as any user who is not an admin/member of that gdrive, I can download the file multiple times without any quota errors.
It happens only in incognito or when unauthenticated, so we might have to implement a check or something for shared drives. I've yet to test all of this on a personal drive; I'm sure there won't be any problems there.
Yeah, Google has been doing this for a long time. Authenticated users have no limitations on downloads of public files/folders.
This applies to both shared and non-shared drives.
Now that you have confirmed the issue is entirely down to authentication, we just have to implement authentication. It would be easy to do, like in google-drive-upload.
I'll start working on this issue next week.
Great, sure!
Install the test build of gdl:

```sh
curl -Ls --compressed https://github.com/Akianonymus/gdrive-downloader/raw/master/install.sh | sh -s -- -b wip -c test_gdl
```

Then use the `test_gdl` command to test.
Added an -a / --auth flag to use authentication.
Read https://github.com/Akianonymus/gdrive-downloader/tree/wip#authentication for more info.
Awesome, works perfectly fine with 0 fails.
Thanks for working on this!
Unrelated question:
Have you taken a look at integrating an API key rather than OAuth tokens for GDL? It's just that the OAuth process takes longer, and you need a refresh token when using the saved config file after some time.
https://console.cloud.google.com/apis/credentials/
For the API key, we could just export it as an environment variable, or take it as input from the user.
Hey, so for some reason it is only downloading 100 files from the folder. I created 1000 test files, uploaded them to a directory, and it downloads only 100 from every directory.
You can use the shared drive link for testing further.
Authenticated:
test_gdl https://drive.google.com/open?id=1JTslaDrtoFNnOC9APuRxQXnLv55cN8ay -p 20 --auth
Unauthenticated:
test_gdl https://drive.google.com/open?id=1JTslaDrtoFNnOC9APuRxQXnLv55cN8ay -p 20
Files count:
> Unrelated question: Have you taken a look at integrating an API key rather than OAuth tokens for GDL? It's just that the process takes longer and you need a refresh token when you're using the saved config file after some time. For the API key, just export it as an environment variable, or we can take it as input from the user.
Yeah, it should be simple enough; we just have to add key=api_key to the URL instead of the access_token header.
E.g.: curl -o filename "${API_URL}/drive/${API_VERSION}/files/${file_id}?alt=media&key=${API_KEY}"
If I remember correctly, API keys have a lower quota for downloading files; I have to recheck.
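Since the difference between the two styles is easy to blur, here is a minimal POSIX sh sketch of it, with placeholder token/key/file-id values (illustrative only, not gdl's actual code): OAuth sends the token in an Authorization header, while an API key rides on the URL as a query parameter.

```shell
#!/bin/sh
# Sketch only: ACCESS_TOKEN, API_KEY, and FILE_ID are placeholders.
API_URL="https://www.googleapis.com"
API_VERSION="v3"
FILE_ID="some_file_id"
API_KEY="placeholder_key"

# OAuth style: the token travels in an Authorization header.
oauth_download() {
    curl -s -o "file.bin" \
        -H "Authorization: Bearer ${ACCESS_TOKEN}" \
        "${API_URL}/drive/${API_VERSION}/files/${FILE_ID}?alt=media"
}

# API-key style: no header; the key is appended as a query parameter.
key_url() {
    printf '%s\n' "${API_URL}/drive/${API_VERSION}/files/${FILE_ID}?alt=media&key=${API_KEY}"
}

key_url
```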
> Hey, so due to some reason, it is only downloading 100 files from the folder. I created 1000 test files, uploaded them in a directory and it is downloading 100 from every directory. You can use the shared drive link for testing further.
>
> Authenticated: `test_gdl https://drive.google.com/open?id=1JTslaDrtoFNnOC9APuRxQXnLv55cN8ay -p 20 --auth`
> Unauthenticated: `test_gdl https://drive.google.com/open?id=1JTslaDrtoFNnOC9APuRxQXnLv55cN8ay -p 20`
That's because you are doing too many parallel downloads; adjust that value and use the retry flag to handle possible errors from API limits.
e.g.: test_gdl 1JTslaDrtoFNnOC9APuRxQXnLv55cN8ay -p 100 -R 10
That should be superfast without any errors (hopefully).
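For what it's worth, a retry flag like -R usually boils down to a small wrapper loop around the download command. A rough POSIX sh sketch of the idea (illustrative only, not gdl's actual implementation):

```shell
#!/bin/sh
# retry MAX CMD [ARGS...]: run CMD until it succeeds, at most MAX
# attempts, sleeping a little longer after each failure.
retry() {
    max="$1"; shift
    attempt=1
    while ! "$@"; do
        [ "${attempt}" -lt "${max}" ] || return 1
        sleep "${attempt}"   # simple linear backoff
        attempt=$(( attempt + 1 ))
    done
}

# Example: retry a flaky download up to 10 times.
# retry 10 curl -fsSL -o file.bin "https://example.com/file"
```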
The thing is, all the files download fine, but the total count it shows is 100.
I downloaded another directory with subfolders, 4 folders in total, each having 50+ files in them. That worked fine; maybe 100+ files doesn't work?
Just to add here: the original directory has 1000 files.
I tested two other directories with 176 and 376 files, and it downloaded only 100 from them too.
Ooh, I misunderstood your problem. Seems like I have to handle it separately when there are more than 100 files.
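For context, the 100-file cap matches the Drive v3 files.list default page size: each call returns at most one page (100 items by default, up to 1000 with pageSize), and any further files require following nextPageToken. A rough POSIX sh sketch of that loop, assuming an API_KEY variable and a crude sed JSON parse (illustrative, not gdl's actual code):

```shell
#!/bin/sh
# Pull nextPageToken out of a files.list JSON response (crude sed
# parse; good enough for this flat, top-level field).
next_token() {
    sed -n 's/.*"nextPageToken"[": ]*\([^"]*\)".*/\1/p'
}

# List every file in a folder by following nextPageToken until the
# API stops returning one. API_KEY is assumed to be set.
list_all_files() {
    folder_id="$1"
    token=""
    while :; do
        response="$(curl -s "https://www.googleapis.com/drive/v3/files?q='${folder_id}'+in+parents&pageSize=1000&key=${API_KEY}${token:+&pageToken=${token}}")"
        printf '%s\n' "${response}"
        token="$(printf '%s' "${response}" | next_token)"
        [ -n "${token}" ] || break
    done
}
```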
Yosh, man oh man I'm good at QA :joy:
So, I have added two new flags: -o/--oauth and -k/--key.
-o is for OAuth authentication and -k is for using an API key.
Providing an API key is optional; if not provided, the script will use its built-in API key.
See more info here: https://github.com/Akianonymus/gdrive-downloader/tree/wip#usage
To test, update `test_gdl` with the `test_gdl -u` command.
I have pushed fixes for the 100-files problem; update and try that too.
Added an option to save a custom API key in the config file.
Awesome, lemme try.
Man, awesome stuff, working perfectly fine!
Thanks for working on this :heart:
Hey,
Hope you're doing well. A "Download quota exceeded" error comes up on my shared drive whenever I try to download.
I know it's self-explanatory, but I'm downloading files from my own drive; I just made it shareable, and it can't even download the files (from a folder) once. A bug with the shared drive? A limitation on the gdrive side?
When I open those files in incognito, the same thing happens. But when I try accessing or downloading them while authenticated, it works perfectly fine.
Can we somehow integrate credentials or some tokens to make it work based on the account, just like https://github.com/labbots/google-drive-upload?
Here's a breakdown of what I've observed: (sometimes) the first file downloads perfectly fine from the folder, then the rest of the files start giving "download quota exceeded" even though they weren't even downloaded or accessed. Strange. Maybe they blacklist the IP? But after only one download, that doesn't make sense.
Let me know if you want to try this yourself and I'll create a fresh URL of files that haven't been downloaded ever and you can try and reproduce it yourself.
Here's a POC from a URL I created (for the very first time):
With this limitation, one can't ever download files in bulk (even personal ones). Most probably we're overlooking something.