vigeeking opened this issue 4 years ago (Open)
random thought: use two domains to back each other up?
Once again followed up on this thread: https://www.reddit.com/r/DataHoarder/comments/598pb2/tutorial_how_to_make_an_encrypted_acd_backup_on/
Was able to recover my old keys and remount an old Google share I had. Rclone has come a long way since I last played with it, and looks pretty set-and-forget for now. I just need to figure out how to mount the drive; the remotes are all set.
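For reference, a minimal sketch of what the layered remote setup in rclone.conf looks like in this kind of arrangement; the remote names (gdrive, secret), the backup path, and the redacted values are placeholders, not my actual config:

[gdrive]
type = drive
scope = drive
token = {"access_token":"<redacted>"}

[secret]
type = crypt
remote = gdrive:backup
password = <obscured password>
password2 = <obscured salt>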
Next up: looking into FUSE to mount the shares, and then I'll start the backup process.
Got the folder mounted to /media/google on the VM using:
rclone mount --daemon secret:/ /media/google
Note: there is a mix of encrypted and non-encrypted files.
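To sanity-check the mount before relying on it, something like the following should do (paths match the mount command above):

# confirm the FUSE mount is active
mount | grep /media/google
# list the top level through the crypt remote directly
rclone lsd secret:/
# cleanly unmount when needed
fusermount -u /media/google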
Now to look into backup solutions.
https://www.howtogeek.com/451262/how-to-use-rclone-to-back-up-to-google-drive-on-linux/ Seems pretty straight forward. Next step is to mount my local nfs share (time to actually make it nfs instead of samba/cifs?) on the media box and start uploading.
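If I do switch that share from Samba/CIFS to NFS, the setup would look roughly like this; the hostname (odin), export path, and subnet are assumptions for illustration:

# on the machine hosting the array, /etc/exports:
/srv/media  192.168.1.0/24(ro,no_subtree_check)
# then reload exports: sudo exportfs -ra

# on the media box, /etc/fstab entry mounting it at /media/odin:
odin:/srv/media  /media/odin  nfs  ro,_netdev  0  0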
Backup commencing; the sync estimates about 2 weeks to transfer the remaining 1.6 TB of data.
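Quick sanity check on that estimate: 1.6 TB over 14 days is roughly 1.6e12 bytes / 1,209,600 seconds ≈ 1.3 MB/s (about 10 Mbit/s) of sustained upload, so the two-week figure is at least internally consistent.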
Note: I initiated a one-time backup. Before this task is done I need to make it a cron job.
Ran into some network performance issues, so I modified the script to throttle bandwidth during business hours to avoid potential work interruptions for Jen. New script:
/usr/bin/rclone copy --update --bwlimit "08:00,512 18:00,1M 23:00,off" --verbose --transfers 30 --checkers 8 --contimeout 60s --timeout 300s --retries 3 --low-level-retries 10 --stats 1s "/media/odin/" "secret:/"
Still need to set that up as a cron job.
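A sketch of how that could be wired up, assuming the command above gets wrapped in a script at /usr/local/bin/odin-backup.sh (hypothetical path) with output going to a log file:

#!/bin/sh
# /usr/local/bin/odin-backup.sh (hypothetical wrapper script)
# --stats 1s kept as-is from the command above; a larger interval
# would keep the log smaller
/usr/bin/rclone copy --update \
  --bwlimit "08:00,512 18:00,1M 23:00,off" \
  --verbose --transfers 30 --checkers 8 \
  --contimeout 60s --timeout 300s \
  --retries 3 --low-level-retries 10 --stats 1s \
  "/media/odin/" "secret:/" >> /var/log/odin-backup.log 2>&1

# crontab entry: kick off the copy nightly at 23:05; the bwlimit
# timetable throttles it once business hours start again, and
# flock -n skips the run if the previous one is still going
5 23 * * * flock -n /tmp/odin-backup.lock /usr/local/bin/odin-backup.sh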
Forgot to edit /etc/fstab. This has now been done.
Modified bandwidth limitations so that I could do more streaming during normal hours:
/usr/bin/rclone copy --update --bwlimit "08:00,1 18:00,1 23:00,off" --verbose --transfers 30 --checkers 8 --contimeout 60s --timeout 300s --retries 3 --low-level-retries 10 --stats 1s "/media/odin/" "secret:/"
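Worth noting for future me: rclone treats bare numbers in --bwlimit as KiB/s, so this schedule pins uploads to about 1 KiB/s from 08:00 until 23:00. The same schedule with explicit units would be:

--bwlimit "08:00,1K 18:00,1K 23:00,off"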
Time fixes all.
I have a RAID 6 array where all of my critical data is stored. I have theoretical cold backups, but I don't trust them and they aren't that current. This task is done when I am comfortable with the backup status of all my media AND there is a specific folder I know I can put data in to have it ingested and backed up.
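For that last piece, a minimal sketch of the kind of ingest job I have in mind, assuming a hypothetical drop folder at /media/odin/ingest that gets copied up more frequently than the bulk media:

# crontab entry (hypothetical): copy the ingest drop folder to the encrypted remote hourly
0 * * * * /usr/bin/rclone copy --update "/media/odin/ingest/" "secret:/ingest/" >> /var/log/odin-ingest.log 2>&1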