ghost opened this issue 8 years ago
This is a really nasty hack, but it does seem to work.
```diff
# diff -u Brive/backend.py Brive-v2/backend.py
--- Brive/backend.py	2016-05-06 08:22:39.626896994 -0400
+++ Brive-v2/backend.py	2016-05-06 08:22:39.615771904 -0400
@@ -67,7 +67,8 @@
         return name
 
     # UTC ISO-8601 time
-    _UTC_TIME_PATTERN = r'%Y-%m-%dT%H%M%SZ'
+    # _UTC_TIME_PATTERN = r'%Y-%m-%dT%H%M%SZ'
+    _UTC_TIME_PATTERN = r'GD-BACKUPS'
 
     @staticmethod
     def _generate_session_name():
```
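For context on why that one-line change works, here is a minimal sketch, assuming the session name is produced by running `_UTC_TIME_PATTERN` through `strftime()` (which is what the `_generate_session_name` context in the diff suggests):

```python
from datetime import datetime, timezone

# Original pattern: each run gets its own timestamped session folder.
_UTC_TIME_PATTERN = r'%Y-%m-%dT%H%M%SZ'

def _generate_session_name():
    # strftime() substitutes the %-directives and passes every other
    # character through unchanged, so a pattern containing no '%' at
    # all (like 'GD-BACKUPS') yields the same literal name every run.
    return datetime.now(timezone.utc).strftime(_UTC_TIME_PATTERN)

print(_generate_session_name())  # e.g. '2016-05-06T122239Z'
```

With the pattern swapped to `r'GD-BACKUPS'`, every session resolves to the same folder name, which is exactly what the hack above exploits.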
PR #19 should make this possible.
I'm trying to figure out if this is a bad thing.
I have about 5000 Google accounts whose Google Drive files I'd like to download.
If I do one run over all of the accounts, a single bad file will crash the backup of all the remaining accounts. Plus, the run will take days.
Right now I have a Bash script that backs up the accounts one per Brive run, 15 runs at a time. (I can actually max out our company's internet pipe this way.) However, I then wind up with 5000 backup folders, each containing only one account, so I wrote another script that rsyncs all of the folders into one "combined" folder.
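A minimal sketch of that fan-out, in Python rather than Bash, and assuming a hypothetical per-account invocation like `brive -u <account>` (the real Brive flags may differ):

```python
import subprocess
from concurrent.futures import ThreadPoolExecutor

def backup(account: str) -> int:
    # One Brive process per account: a bad file only kills that
    # account's run, not the remaining 4999.
    return subprocess.run(['brive', '-u', account]).returncode

with open('accounts.txt') as f:
    accounts = [line.strip() for line in f if line.strip()]

# 15 concurrent runs, matching the Bash script described above.
with ThreadPoolExecutor(max_workers=15) as pool:
    codes = list(pool.map(backup, accounts))

failed = [a for a, c in zip(accounts, codes) if c != 0]
print(f'{len(failed)} account(s) failed:', failed)
```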
If I were to add a command-line parameter that makes the destination folder NOT add the timestamped subfolder, would that create issues? Since I'm using a ZFS filesystem with snapshotting, I don't really need the timestamped folder names.
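For what such a parameter might look like, a hypothetical sketch (not Brive's actual CLI), assuming a `--flat` flag that skips the session subfolder:

```python
import argparse
from datetime import datetime, timezone

parser = argparse.ArgumentParser(prog='brive')
parser.add_argument('--flat', action='store_true',
                    help='write backups directly into the backup root, '
                         'without a timestamped session subfolder')
args = parser.parse_args()

# With --flat, filesystem snapshots (e.g. ZFS) take over the role of
# the per-session timestamp; otherwise keep the current behavior.
session_dir = '' if args.flat else \
    datetime.now(timezone.utc).strftime('%Y-%m-%dT%H%M%SZ')
print('session dir:', repr(session_dir))
```

The main thing to watch would be collisions: without the timestamped subfolder, two runs for the same account would presumably write into the same path, overwriting files in place rather than accumulating them per session.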