vigeeking / homeAutomation

My goal is to create a pipeline that is built exclusively with tools I either already know, or am only learning because they provide added value to the project
https://github.com/vigeeking/homeAutomation

Back up data to the cloud #6

Open vigeeking opened 4 years ago

vigeeking commented 4 years ago

I have a RAID 6 array where all of my critical data is stored. I have theoretical cold backups, but I don't trust them and they aren't that current. This task is done when I am comfortable with the backup status of all my media AND there is a specific folder I know I can put data into to have it ingested and backed up.

vigeeking commented 4 years ago

Random thought: use two domains to back each other up?

vigeeking commented 4 years ago

Once again followed up with this thread: https://www.reddit.com/r/DataHoarder/comments/598pb2/tutorial_how_to_make_an_encrypted_acd_backup_on/

Was able to recover my old keys and remount an old Google share I had. Rclone has come a long way since I last played with it, and looks pretty set-and-forget for now. I just need to figure out how to mount the drive; the remotes are all set.

Next up: looking into FUSE to mount the shares, and then I'll start the backup process.
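
For context, secret: is a crypt remote layered over the Google Drive remote, roughly the shape from that tutorial. A minimal sketch of the rclone.conf layout (remote names and values here are placeholders, not the real config):

# ~/.config/rclone/rclone.conf (illustrative)
[gdrive]
type = drive
scope = drive

[secret]
type = crypt
remote = gdrive:backup
filename_encryption = standard
password = <obscured>
password2 = <obscured>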

vigeeking commented 4 years ago

Got the folder mounted to /media/google on the VM using:

rclone mount --daemon secret:/ /media/google

Note: there is a mix of encrypted and non-encrypted files.
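
A few sanity checks on the mount, for my own reference (standard rclone/FUSE commands):

rclone lsd secret:           # list top-level directories through the remote
ls /media/google             # confirm the FUSE mount is serving files
fusermount -u /media/google  # cleanly unmount when needed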

Now to look into backup solutions.

vigeeking commented 4 years ago

https://www.howtogeek.com/451262/how-to-use-rclone-to-back-up-to-google-drive-on-linux/ Seems pretty straightforward. Next step is to mount my local NFS share (time to actually make it NFS instead of Samba/CIFS?) on the media box and start uploading.
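
If I do switch the share to NFS, the client side is just an fstab line, roughly like this (server name and export path are placeholders):

# /etc/fstab on the box doing the upload (illustrative)
odin:/export/media  /media/odin  nfs  defaults,_netdev  0  0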

vigeeking commented 4 years ago

Backup commencing; the sync estimates about two weeks to transfer the remaining 1.6 TB of data.
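
Sanity-checking that estimate: 1.6 TB over two weeks is about 1.6e12 B / 1.2e6 s ≈ 1.3 MB/s, i.e. roughly 10 Mbit/s of sustained upload.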

vigeeking commented 4 years ago

Note: I initiated a one-time backup. Before this task is done I need to make it a cron job.

vigeeking commented 4 years ago

Ran into some network performance issues, so I modified the script to throttle bandwidth during business hours and avoid potential work interruptions for Jen. New script:

/usr/bin/rclone copy --update --bwlimit "08:00,512 18:00,1M 23:00,off" --verbose --transfers 30 --checkers 8 --contimeout 60s --timeout 300s --retries 3 --low-level-retries 10 --stats 1s "/media/odin/" "secret:/"

(For reference: the --bwlimit timetable caps the transfer at 512 KiB/s from 08:00, 1 MiB/s from 18:00, and lifts the limit at 23:00; bare numbers are KiB/s.)

Still need to set up a cron job for that.
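
A sketch of the eventual cron entry, assuming the command gets wrapped in a script (path and schedule are placeholders):

# crontab -e (illustrative)
# kick off the sync nightly at 01:00; --bwlimit handles daytime throttling
0 1 * * * /usr/local/bin/rclone-backup.sh >> /var/log/rclone-backup.log 2>&1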

vigeeking commented 4 years ago

Forgot to edit /etc/fstab. This has now been done.
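
For reference, newer rclone releases ship a mount helper, so the rclone mount itself can live in fstab too; something like this (untested sketch, needs the one-time mount.rclone symlink):

# ln -s /usr/bin/rclone /sbin/mount.rclone   # enable the fstab mount helper
secret:/  /media/google  rclone  rw,noauto,nofail,_netdev,x-systemd.automount,args2env  0  0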

vigeeking commented 4 years ago

Modified the bandwidth limits so that I could do more streaming during normal hours (a bare --bwlimit value of 1 means 1 KiB/s, which effectively pauses the upload until 23:00):

/usr/bin/rclone copy --update --bwlimit "08:00,1 18:00,1 23:00,off" --verbose --transfers 30 --checkers 8 --contimeout 60s --timeout 300s --retries 3 --low-level-retries 10 --stats 1s "/media/odin/" "secret:/"

vigeeking commented 4 years ago

Time fixes all.