RHarryH opened this issue 3 years ago
This is what I do. Encrypt your service account json file (using gpg, pass, etc.). Then, when you want to use gdrive, decrypt the file and pipe it into gdrive, which will read it from standard input via /dev/stdin. For example:
gpg -d ~/.gdrive/account.json.enc | gdrive -c / --service-account /dev/stdin list
It'd be nicer if there were a --service-account - option to signify reading from standard input, but let's make do for now.
Note that this is still not 100% safe, since anyone with access to your /proc can attempt to read it from the stdin fd in real time, but it will be encrypted at rest.
This is slightly nicer than the solution mentioned in #210 since it works with any keychain program that can output to stdout.
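For completeness, the encrypted file in the example above could be produced with something like the following (the key ID and file paths here are just placeholders, assuming you have a personal GPG key pair):

gpg --encrypt --recipient YOUR_KEY_ID --output ~/.gdrive/account.json.enc account.json

pass, or any other keychain tool that can print the stored plaintext to stdout, slots into the same pipeline in place of gpg -d.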
I played with it for a while and it works nicely. However, gpg requires a passphrase to decrypt the file, which breaks the automation of the backup process. I could pass it on the command line or keep it in a file, but then I feel I'm back at the starting point with this solution.
You said you use gpg for your purposes. Could you please share your solution to the case mentioned above?
I only use gdrive for ad-hoc uploads and downloads, so this isn't an issue for me.
However, for my automated backups, I use a key management service (KMS). You can use AWS's for free as long as you don't go overboard. Store your service account json with the KMS. The KMS will provide a key to access the file. Hand that key to the remote server. When it's time for the backup, your machine will use the key to ask the KMS for the service account json file, and use it to run gdrive. In the event that your remote server gets compromised, you can revoke that key's access to the json file in the KMS. Since the plaintext json file was never stored on the server, the attacker won't have access to your google account.
If you don't fully trust Amazon, you can also encrypt it once before uploading it to the KMS and hand that key to the server as well, so it will do an additional layer of decryption upon receiving the file from the KMS.
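For illustration, here is a rough sketch of that flow using the AWS CLI (the key alias, file names, and IAM setup are assumptions on my part, not something gdrive prescribes). Encrypt the JSON once from a trusted machine:

aws kms encrypt --key-id alias/gdrive-backup --plaintext fileb://account.json --query CiphertextBlob --output text | base64 -d > account.json.enc

Then, on the backup server (which only holds credentials allowed to call kms:Decrypt on that key), decrypt it straight into gdrive so the plaintext never touches disk:

aws kms decrypt --ciphertext-blob fileb://account.json.enc --query Plaintext --output text | base64 -d | gdrive -c / --service-account /dev/stdin list

Revoking the server's kms:Decrypt permission cuts off its access to the account, and the extra layer from the previous paragraph just means inserting a gpg -d step after the KMS decrypt.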
Another, more manual method is to set up a gpg-agent (or any other keyring manager) and give it a key cache store time of 7 days (or 30 days). Then, every week (or month) you can log on to your machine and type in your password to reset the key cache's store time for another 7/30 days. This way, if you don't explicitly log in to type in your password, the machine won't have access to the json file for longer than 7/30 days.
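As a rough sketch of that setup (assuming a standard GnuPG 2.x install; the TTL values are in seconds), you could put something like this in ~/.gnupg/gpg-agent.conf:

default-cache-ttl 604800
max-cache-ttl 604800

604800 seconds is 7 days; use 2592000 for the 30-day variant. Reload the agent with gpg-connect-agent reloadagent /bye. Until you log in and type the passphrase again, the cached key, and with it access to the decrypted json, expires after that window, as described above.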
You can even combine the two methods if you want, I suppose.
EDIT: Two things I should mention, though.
Hi all,
What if I'd like to use the gdrive program to send backups of a remote server? If my server is compromised, whoever breaks in could potentially have full access to my gdrive account. Is there any way to limit gdrive's functionality to only one folder, or some other trick that would make me feel safe?