You need to request higher limits for the Google API, under the Google Drive API section of the Google Cloud Platform.
Can I do it? Can you tell me how?
See #132
I just opened a pull request because I encountered the same problem, and thus I just learnt Go for this purpose ;) @EverMineServer If you only need the `download` or `download query` command, then you should be able to use the code in my branch, but no promises until it's actually merged.
I'm having the same problem when running `gdrive upload --recursive` on a directory containing 2000 small files totalling a measly 8 MB.
@jucor perhaps you can take a look at my PR #309 and try to apply the same changes to the upload. I'm not using this program any more (I only needed it on exactly one occasion, for downloading...) and currently don't have time to apply similar changes to the other parts of the program. At least the download did work for me back then, when I was downloading over 2000 files.
Hi, any updated solution for this?
I'm getting the same error while uploading just a single .gz file
I am having the exact same issue when trying to upload a single file. My upload rate over the past few days has been about 20 MB/day, so it's hard to believe I have reached a limit.
Interesting... It just works again, and I touched nothing other than visiting the API dashboard to check what was going on. These computers nowadays....
Yes, this is what is happening: it gives the error randomly. This needs to be fixed.
It happened to me and then I reran the program and everything was fixed.
It occurs randomly for me when I upload one large file (60 MB).
gdrive: 2.1.0
Golang: go1.6
OS/Arch: linux/amd64
+1, rerunning works.
I don't think it's a size limit; I uploaded around 1 GB yesterday, and now I cannot upload around 50 MB. PS: I'm on a different Internet connection.
gdrive: 2.1.0
OS: Linux (Mint)
For me, this error occurred the first time I used gdrive. Rerunning it fixed everything. Note: I'm on a G Suite account.
I've tried multiple connections but without success. It sucks...
Probably somebody could add an auto-retry feature.
I encountered the same issue yesterday, using the --recursive option on all files within a folder. The empty folder was created on Google Drive but none of the files were, and I received this error. The Google API console had my rate at 1,000 calls per 100 seconds.
Does anyone know what is making it throw the error?
In my script's retry loop, I catch the output and grep for "Failed.*rateLimitExceeded"; after 10s it tries again, until it works. It usually doesn't take more than 3 retries :). It happens with any file size, apparently. I think (expect) they may improve the servers to handle more satisfied users soon! xD
> In my script's retry loop, I catch the output and grep for "Failed.*rateLimitExceeded"; after 10s it tries again, until it works. It usually doesn't take more than 3 retries :). It happens with any file size, apparently. I think (expect) they may improve the servers to handle more satisfied users soon! xD
if you do something like

```sh
while true; do
    out=$(./gdrive download {asdf} --recursive --skip 2>&1)
    echo "$out"
    echo "$out" | grep -q "Failed.*rateLimitExceeded" || break
    sleep 10
done
```
you are still re-calling the API for every single file, including the files that you've already downloaded, so it's still highly possible to get another 403 error before you get to the next file to download.
My dirty fix: edit $GO_DIR/src/.../gdrive/drive/download.go at around line 245:

```go
for _, f := range files {
    // Copy args and update changed fields
    newArgs := args
    newArgs.Path = newPath
    newArgs.Id = f.Id
    newArgs.Stdout = false

    // Retry until the download succeeds; note this retries forever
    // on a persistent error. Requires "fmt" and "time" in the imports.
    for {
        err = self.downloadRecursive(newArgs)
        if err == nil {
            break
        }
        fmt.Println("retry after 5 seconds")
        time.Sleep(5 * time.Second)
    }
}
```
dirty, but (kind of) works
Still looking forward to a fix from the repo owner. Suggestion: use API keys provided by users instead of one app key shared by everyone.
You know what, inspired by wireless protocols, I came up with an idea: sleep for a random number of seconds! If everyone complied with this rule, we would be less likely to "collide" with each other, which is what causes the "overspeeding". Just kidding XD
Update: do this only when you are in a rush; check out @euklid's fork for a stable fix.
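In the same spirit, here is a minimal sketch of randomized exponential backoff, the standard version of the "sleep for random seconds" idea; `retryWithJitter` and its parameters are hypothetical names, not part of gdrive:

```go
package main

import (
	"fmt"
	"math/rand"
	"time"
)

// retryWithJitter retries op with exponential backoff plus random
// jitter, so concurrent clients are unlikely to hit the API at the
// same moment. Hypothetical helper, not part of gdrive.
func retryWithJitter(op func() error, maxRetries int) error {
	var err error
	delay := 2 * time.Second
	for i := 0; i < maxRetries; i++ {
		if err = op(); err == nil {
			return nil
		}
		// Add up to delay/2 of random jitter on top of the base delay.
		jitter := time.Duration(rand.Int63n(int64(delay / 2)))
		fmt.Printf("attempt %d failed (%v), retrying in %v\n", i+1, err, delay+jitter)
		time.Sleep(delay + jitter)
		delay *= 2 // double the base delay each round
	}
	return err
}

func main() {
	attempts := 0
	err := retryWithJitter(func() error {
		attempts++
		if attempts < 3 {
			return fmt.Errorf("rateLimitExceeded") // simulated 403
		}
		return nil
	}, 10)
	fmt.Println("done:", err)
}
```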
Cool, I will try to run that script, thanks! BTW, I never use --recursive; I only work with single files, recreating each directory remotely as needed :)
It works when you run the command a second time.
Same problem here, but I don't use recursive and I am not downloading files: I upload them with `~/gdrive sync upload --keep-largest`.
Any help is appreciated!
Same problem as @arjanna describes on my side. However, looking at my sync logs going back to 2017, I think this problem only started occurring recently for me. The first such log entry is from May 2018, and it seems to come and go. I think this is somehow Google-related; I found no explicit info about rate limits for user-authorized tokens.
> Same problem as @arjanna describes on my side. However, looking at my sync logs going back to 2017, I think this problem only started occurring recently for me. The first such log entry is from May 2018, and it seems to come and go. I think this is somehow Google-related; I found no explicit info about rate limits for user-authorized tokens.
I am working around the issue; I found a solution here: https://github.com/gdrive-org/gdrive/issues/426 I hope it works for you @petarov! This fixes the problem from my laptop, but I have to upload files from a supercomputer where I don't have sudo permissions, so I can't use that fix there.
> Interesting... It just works again, and I touched nothing other than visiting the API dashboard to check what was going on. These computers nowadays....
This worked for me!!! 🎉 🎈 🎂 Definitely an odd validation condition on the API. 🤔
I gave up and switched to grive2. It's actually the same.
@andresmitre I moved from grive2 to gdrive because, to upload anything, even 1 kB, I always had to download about 5 MB (the whole remote file list with info to compare locally), which would use up a lot of my daily quota, and there was absolutely nothing I could do to make it strictly one-way (force upload of only local changes). But I had to create my own big script to deal with everything :>
Is there a way to fix this error?

```
Failed to upload file: googleapi: Error 403: Rate Limit Exceeded, rateLimitExceeded
```
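For anyone patching the source: a minimal sketch of how such an error could be recognized, assuming the `google.golang.org/api/googleapi` package that gdrive builds on; `isRateLimitError` is a hypothetical helper, not an existing gdrive function:

```go
package main

import (
	"fmt"

	"google.golang.org/api/googleapi"
)

// isRateLimitError reports whether err is a 403 rate-limit error
// from the Google API, so that only those failures get retried.
func isRateLimitError(err error) bool {
	apiErr, ok := err.(*googleapi.Error)
	if !ok || apiErr.Code != 403 {
		return false
	}
	// The Reason field distinguishes rate limiting from other 403s
	// such as permission errors.
	for _, e := range apiErr.Errors {
		if e.Reason == "rateLimitExceeded" || e.Reason == "userRateLimitExceeded" {
			return true
		}
	}
	return false
}

func main() {
	// Simulated error with the shape reported in this thread.
	err := &googleapi.Error{
		Code:    403,
		Message: "Rate Limit Exceeded",
		Errors:  []googleapi.ErrorItem{{Reason: "rateLimitExceeded"}},
	}
	fmt.Println(isRateLimitError(err)) // true
}
```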