Closed: ffwjb closed this issue 4 years ago.
@ffwjb So I'm not the only one then! It's been bugging me too: I keep hitting my limit as well, and it's been driving me crazy trying to narrow down what exactly is causing it. I'm trying to see what was changed last.
I think it's Rclone, see this topic: https://forum.rclone.org/t/gdrive-error-403-the-download-quota-for-this-file-has-been-exceeded/18931
Hmmm... the settings for rclone are pretty cut and dried... not sure where that topic is leading.
@ffwjb , are you using a client ID and secret for gdrive in rclone?
@bdschuster, Yes I'm using my own client ID and secret for rclone
@ffwjb, yeah, so am I... either way, I'm still hitting the "download" limit. The one odd thing I do see is that when I use my actual G Suite user for rclone, I don't hit my limit, but when using another Gmail account that has my Drive shared to it, I hit the limit on those accounts. Maybe Google changed the download limit for shared files?
Ok, we're back from our trip to 1870 and back into the 21st Century, after 3 days without electricity 🤣
Did you guys change the Rclone service by any chance? Google is (or used to be, I hope they still are) very lenient about overall resource usage, but rather strict about calls per second. That is why we implemented the lines
--tpslimit 10 \
--tpslimit-burst 10 \
in this file https://github.com/TechPerplexed/Gooby/blob/master/scripts/services/rclonefs.service
You could try to lower those to 6 or so and see if that makes a difference. I haven't experienced any bans... fingers crossed!
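For reference, a minimal sketch of what the tpslimit part of that service could look like with the values lowered to 8; the remote name gdrive: and the mount point /mnt/google here are just placeholders, not necessarily what Gooby uses:

[Service]
ExecStart=/usr/bin/rclone mount gdrive: /mnt/google \
  --allow-other \
  --tpslimit 8 \
  --tpslimit-burst 8
ExecStop=/bin/fusermount -uz /mnt/google
Restart=on-failure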
@TechPerplexed , welcome back! 🤣
I did not change any of those. I may try to tweak them, but why this issue is coming up now is what I can't get past 🤔
Thanks! Yeah, same here, as far as I know Rclone didn't change anything, and we definitely didn't. It's probably Google getting stricter. Unrelated perhaps but today I got an email that they're going to automatically empty the trash bin after 30 days, so they might have snuck in some more changes!
Yeah, not sure, but as I said, it seems like it's the Gmail accounts I've shared my Drive to, and not my primary G Suite account, that keep hitting the download limit. And it's strange, because it's hitting the per-file download limit, and it's across my entire library... so yeah, weird.
Fingers crossed, I've tweaked the setting to 8. I'll let you know what it does.
@ffwjb , let me know if that helps!
I'm very curious about this too. Google allows a maximum of 10 TPS at any given time. Exceeding that leads to temporary download bans, which is what I think is happening here. If that is the case, Google may have indeed changed this to a lower limit, so setting it to 8 or even lower should theoretically stop the bans.
Of course I could be all wet and maybe it's something else altogether, but it's worth a try 😄
@TechPerplexed I guess we could say we are "perplexed"? 🤣🤣 sorry, had to 🤣
@bdschuster @TechPerplexed So far so good for now, see the image: the last ban I had was on Thursday; since Friday, with the adjustment to 8, no problems anymore.
@ffwjb awesome! are you using your primary gsuites user to authenticate in rclone?
Looks like those peaks have gone down, that might have done the trick. Let's hope that was it! And let's hope @bdschuster gets sorted too soon 👍
@bdschuster yes I'm using my primary account, I have only one account. In that account I have a few shared folders
@TechPerplexed we shall see. @ffwjb, I only have 1 G Suite user as well, but I have the folders shared to my other regular Gmail accounts for the other servers I have; those are the ones hitting the limit, not the primary G Suite account.
@TechPerplexed @bdschuster Unfortunately it was too soon... a few hours ago, out of the blue, another ban. I don't think it has anything to do with the burst limit. On the rclone forum (see the link a few posts back) more and more people are having the same problem.
Gah oh no, I guess Google really has stepped up on their limits then :( Looking at your charts though, it looks as if that ban only lasted seconds, am I right?
It looks that way, but there is a really thin line which continues at the 100% level.
Oh I see, yeah that didn't show up too clearly. Bummer!
I experienced quite frequent bans in the past, but ever since we went with setting Rclone up as a daemon that centrally controls the tpslimit, I haven't experienced any issues. Especially after I moved across to Emby which seems lighter on resources all around (Plex is a resource hog).
I wish I had some answers, but I'm as baffled as you are!
@bdschuster I think it might have something to do with the option in Plex to refresh the metadata every 3 days. I had this option running, but last week I turned it off, and so far so good: I haven't been banned in the last 4 days.
When in doubt, blame Plex. Never failed me yet 🤣
@ffwjb, well I turned that option off, let's see what happens. Don't know why that option would cause an issue all of a sudden, unless my collection is just growing that much.
Set the Plex scheduled tasks all to never, thumbnails too, skip intro detection too, and add plex_autoscan as a service using the drive monitoring option (a service sketch follows a few lines down).
Last 7 days (sorry):

| API | Requests | Errors |
| --- | --- | --- |
| Google Drive API | 1.082.996 | 0 |
And you will never hit limits
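For the "add plex_autoscan as a service" part, a rough sketch of a systemd unit is below. The install path /opt/plex_autoscan, the seed user and the python3 binary are assumptions; adjust them to wherever and however you installed it, and check the plex_autoscan README for the exact server command:

[Unit]
Description=plex_autoscan server
After=network-online.target

[Service]
User=seed
WorkingDirectory=/opt/plex_autoscan
ExecStart=/usr/bin/python3 /opt/plex_autoscan/scan.py server
Restart=on-failure

[Install]
WantedBy=multi-user.target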
@doob187 can you explain the plexautoscan?
@bdschuster https://github.com/l3uddz/plex_autoscan
@doob187 , this seems awesome! does that mean you don't have ANY periodic scans on? or maybe still like once a day?
Drive monitoring means new files are added to Plex instantly.
Upload a file .. the API sees the file .. rclone reads the file and it gets added to Plex. The delay between upload and it being added is ~20-30 seconds on my side, with no periodic scans and no API hammering.
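In config terms, that drive monitoring flow maps to the GOOGLE block of the plex_autoscan config that shows up later in this thread; a trimmed, illustrative version with monitoring turned on (client ID/secret are your own, the values are examples only):

"GOOGLE": {
  "ENABLED": true,
  "CLIENT_ID": "<your-client-id>",
  "CLIENT_SECRET": "<your-client-secret>",
  "POLL_INTERVAL": 60,
  "TEAMDRIVE": false
}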
@bdschuster
And if you want it as a Docker container:
https://hub.docker.com/repository/docker/mrdoob/pascan or here https://hub.docker.com/r/sabrsorensen/alpine-plex_autoscan, or fork it and create your own Docker image ;-)
@doob187, well, I have it installed and configured. Didn't have any luck with the Docker version for some reason. It appears my API hits have gone down. The only issue I'm having is with music; apparently it can't find it in Plex when trying to add tracks, even though the directory matches and everything.
Plex music doesn't work with plex_autoscan.
@doob187, so I'm assuming I should just have Lidarr tell Plex to scan like normal?
Yes.
This program truly did help with API hits, thanks for the recommendation @doob187. I believe it's just because we have such a huge library now @ffwjb: Plex often fails to scan just the individual files, so it does a full library scan, which kills our API hits. Other than implementing something like this program, I'm not sure there is much we can do about this. I'm going to close this out for now, but if we see a lot more complaints about this, we will revisit.
@bdschuster Yes, it's due to Plex scanning a large library. My first solution, turning off the 3-day scan, worked at first, but last week I was hit by a ban again. Did you manage to get it running with Gooby, or did you set it up manually?
@ffwjb, it's a fair bit of setup (the config file), but once you get it down, it's pretty simple. I didn't get the Docker version to work; for some reason it would pull the Docker image but wouldn't create the container (I'm going to get back to it at some point). But now that it's working, my API hits are next to nothing, and I don't have Plex scanning at ALL anymore, except for music, since plex_autoscan doesn't work with it, as @doob187 explained. Let me know if you need help setting it up manually.
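For anyone else setting it up manually, the rough outline is below. This assumes a Python 3 environment and a clone into /opt; treat it as a sketch and follow the l3uddz/plex_autoscan README for the authoritative steps:

cd /opt
git clone https://github.com/l3uddz/plex_autoscan
cd plex_autoscan
python3 -m pip install -r requirements.txt
# edit config/config.json (start from the example file in the repo),
# then sanity-check that plex_autoscan can see your Plex libraries:
python3 scan.py sections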
@bdschuster yes please, any help with setting this up would be great. My setup is based purely on Gooby.
@bdschuster where can I find the location "/usr/lib/plexmediaserver/lib"?
@ffwjb, sorry, missed this. You really only need to worry about 2 of them:
"PLEX_DATABASE_PATH": "/var/local/Gooby/Plex/Library/Application Support/Plex Media Server/Plug-in Support/Databases/com.plexapp.plugins.library.db",
"PLEX_SUPPORT_DIR": "/config/Library/Application\ Support",
@bdschuster sorry to bother you again, but I'm getting an error when I want to use Plex sections for locating the right media. Can you help me out with the settings below?
"PLEX_ANALYZE_DIRECTORY": true,
"PLEX_ANALYZE_TYPE": "basic",
"PLEX_FIX_MISMATCHED": false,
"PLEX_FIX_MISMATCHED_LANG": "en",
"PLEX_DATABASE_PATH": "/var/local/Gooby/Plex/Library/Application Support/Plex Media Server/Plug-in Support/Databases/com.plexapp.plugins.library.db",
"PLEX_EMPTY_TRASH": false,
"PLEX_EMPTY_TRASH_CONTROL_FILES": [
"/mnt/unionfs/mounted.bin"
],
"PLEX_EMPTY_TRASH_MAX_FILES": 100,
"PLEX_EMPTY_TRASH_ZERO_DELETED": false,
"PLEX_LD_LIBRARY_PATH": "/usr/lib/plexmediaserver/lib",
"PLEX_SCANNER": "/usr/lib/plexmediaserver/Plex\ Media\ Scanner",
"PLEX_SUPPORT_DIR": "/config/Library/Application\ Support",
"PLEX_USER": "",
"PLEX_TOKEN": "",
"PLEX_LOCAL_URL": "http://localhost:32400",
"PLEX_CHECK_BEFORE_SCAN": false,
"PLEX_WAIT_FOR_EXTERNAL_SCANNERS": true,
"RCLONE": {
"BINARY": "/usr/bin/rclone",
"CONFIG": "/.config/rclone/rclone.conf",
"CRYPT_MAPPINGS": {
},
"RC_CACHE_REFRESH": {
"ENABLED": false,
"FILE_EXISTS_TO_REMOTE_MAPPINGS": {
"Media/": [
"/mnt/google/"
]
},
"RC_URL": "http://localhost:5572"
}
},
"RUN_COMMAND_BEFORE_SCAN": "",
"RUN_COMMAND_AFTER_SCAN": "",
"SERVER_ALLOW_MANUAL_SCAN": false,
"SERVER_FILE_EXIST_PATH_MAPPINGS": {
"/mnt/unionfs/media/": [
"/data/"
]
},
"SERVER_IGNORE_LIST": [
"/.grab/",
".DS_Store",
"Thumbs.db"
],
"SERVER_IP": "0.0.0.0",
"SERVER_MAX_FILE_CHECKS": 10,
"SERVER_FILE_CHECK_DELAY": 60,
"SERVER_PASS": "9c4b81fe234e4d6eb9011cefe514d915",
"SERVER_PATH_MAPPINGS": {
"/mnt/unionfs/": [
"/home/seed/media/fused/"
@ffwjb Plex Sections are obsolete as it uses the path to find the library. But for the heck of it, what's the error?
@bdschuster when I run python scan.py sections, I get the error Plex Sections: bash: /var/local/Gooby/Plex/Library/Application: No such file or directory
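That error pattern usually means a path containing a space is reaching the shell unquoted, so bash splits it at the space and only sees everything up to "Application". A quick generic illustration of the difference (nothing plex_autoscan specific, just shell behaviour):

P="/var/local/Gooby/Plex/Library/Application Support"
ls $P     # unquoted: split into .../Application and Support -> "No such file or directory"
ls "$P"   # quoted: passed through as a single path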
@ffwjb , can you PM me your entire config please?
@bdschuster thank you for your help! I took the example file from l3uddz/plex_autoscan and I'm trying to make it work on my server, no luck so far.
{
"DOCKER_NAME": "plex",
"GOOGLE": {
"ENABLED": false,
"CLIENT_ID": "",
"CLIENT_SECRET": "",
"ALLOWED": {
"FILE_PATHS": [
],
"FILE_EXTENSIONS": true,
"FILE_EXTENSIONS_LIST": [
"webm","mkv","flv","vob","ogv","ogg","drc","gif",
"gifv","mng","avi","mov","qt","wmv","yuv","rm",
"rmvb","asf","amv","mp4","m4p","m4v","mpg","mp2",
"mpeg","mpe","mpv","m2v","m4v","svi","3gp","3g2",
"mxf","roq","nsv","f4v","f4p","f4a","f4b","mp3",
"flac","ts"
],
"MIME_TYPES": true,
"MIME_TYPES_LIST": [
"video"
]
},
"TEAMDRIVE": false,
"TEAMDRIVES": [],
"POLL_INTERVAL": 60,
"SHOW_CACHE_LOGS": false
},
"PLEX_ANALYZE_DIRECTORY": true,
"PLEX_ANALYZE_TYPE": "basic",
"PLEX_FIX_MISMATCHED": false,
"PLEX_FIX_MISMATCHED_LANG": "en",
"PLEX_DATABASE_PATH": "/var/local/Gooby/Plex/Library/Application Support/Plex Media Server/Plug-in Support/Databases/com.plexapp.plugins.library.db",
"PLEX_EMPTY_TRASH": false,
"PLEX_EMPTY_TRASH_CONTROL_FILES": [
"/mnt/unionfs/mounted.bin"
],
"PLEX_EMPTY_TRASH_MAX_FILES": 100,
"PLEX_EMPTY_TRASH_ZERO_DELETED": false,
"PLEX_LD_LIBRARY_PATH": "/usr/lib/plexmediaserver/lib",
"PLEX_SCANNER": "/usr/lib/plexmediaserver/Plex\\ Media\\ Scanner",
"PLEX_SUPPORT_DIR": "/config/Library/Application\\ Support",
"PLEX_USER": "",
"PLEX_TOKEN": "",
"PLEX_LOCAL_URL": "http://localhost:32400",
"PLEX_CHECK_BEFORE_SCAN": false,
"PLEX_WAIT_FOR_EXTERNAL_SCANNERS": true,
"RCLONE": {
"BINARY": "/usr/bin/rclone",
"CONFIG": "/.config/rclone/rclone.conf",
"CRYPT_MAPPINGS": {
},
"RC_CACHE_REFRESH": {
"ENABLED": false,
"FILE_EXISTS_TO_REMOTE_MAPPINGS": {
"Media/": [
"/mnt/google/"
]
},
"RC_URL": "http://localhost:5572"
}
},
"RUN_COMMAND_BEFORE_SCAN": "",
"RUN_COMMAND_AFTER_SCAN": "",
"SERVER_ALLOW_MANUAL_SCAN": false,
"SERVER_FILE_EXIST_PATH_MAPPINGS": {
"/mnt/unionfs/media/": [
"/data/"
]
},
"SERVER_IGNORE_LIST": [
"/.grab/",
".DS_Store",
"Thumbs.db"
],
"SERVER_IP": "0.0.0.0",
"SERVER_MAX_FILE_CHECKS": 10,
"SERVER_FILE_CHECK_DELAY": 60,
"SERVER_PASS": "9c4b81fe234e4d6eb9011cefe514d915",
"SERVER_PATH_MAPPINGS": {
"/mnt/unionfs/": [
"/home/seed/media/fused/"
]
},
"SERVER_PORT": 3468,
"SERVER_SCAN_DELAY": 180,
"SERVER_SCAN_FOLDER_ON_FILE_EXISTS_EXHAUSTION": false,
"SERVER_SCAN_PRIORITIES": {
"1": [
"/Movies/"
],
"2": [
"/TV/"
]
},
"SERVER_USE_SQLITE": true,
"USE_DOCKER": true,
"DOCKER_NAME":"plex",
"USE_SUDO": false
}
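One quick sanity check on a hand-edited config like this is to confirm the JSON itself parses before pointing plex_autoscan at it; the path below is an assumption, adjust it to wherever your config lives:

python3 -m json.tool /opt/plex_autoscan/config/config.json
# prints the file back if the JSON is valid, or the exact parse error if it isn't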
For the last two weeks I've been getting banned every few days by Google. I use my own API for Rclone.
Any thoughts?