877dev closed this issue 5 years ago
Thanks, great work. I've pulled your modifications and started testing on my side. I've just done a bit of tidying up: set the file name to a variable and added a logfile.
Now I'm just looking at how to remove the old backups from the cloud.
Thank you, I know it's not much, but I learned a few things.
I've since realised that if the date and time were assigned to a variable it would be simpler, and the script would be able to write numerous backups on the same day should that be required:
DATE=$(date +"%Y-%m-%d_%H%M")
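For example, something like this would give each run a uniquely named archive (a sketch only; the ./backups path and the volumes folder as the tar source are assumptions based on the snippets later in this thread):
DATE=$(date +"%Y-%m-%d_%H%M")
backupfile="backup-$DATE.tar.gz"
# each run now writes a distinct file, so several backups taken
# on the same day no longer overwrite each other
sudo tar -czf "./backups/$backupfile" ./volumes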
There's a page here I was looking at for managing backups on the cloud side; I'm still trying to understand it all, but it might be of use: LINK UPDATED
Thanks :)
I went one step further and did this:
logfile=./backups/log.txt
backupfile="backup-$(date +"%Y-%m-%d").tar.gz"
and used it like this:
sudo tar -czf \
"./backups/$backupfile" \
...
Can you just add a log file, something along these lines (just with your variable)?
echo "backup saved to ./backups/$backupfile"
touch $logfile
echo $backupfile >>$logfile
Side note: if you use VS Code for shell scripts, there is a really nice addon for formatting shell scripts (it's based on shfmt, written in Go; I can send you the details).
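For reference, the underlying tool can also be run directly from the command line; a one-line sketch, with the script name as a placeholder:
shfmt -w backup.sh    # rewrites the script in place with consistent formatting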
Hi there, thank you for the suggestions; I have committed my latest changes. I hope this process is not too slow for you, but I am enjoying the challenge :)
I could not get echo $backupfile >> $logfile
to work for me: permission denied.
So I googled and settled on echo $backupfile | sudo tee -a $logfile,
which works fine in my limited tests. It adds the new backup filename to log.txt each time.
Which raises a question I've had for a few days: why is the backups folder owned by root? I am having to use sudo in the script a few times, and I'm not sure if that is bad.
Not sure about the next step; I guess you are thinking about using the log.txt file with some command to delete the older files, then mimicking this with the Dropbox/Google Drive uploads.
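For the local side, one possible approach (a sketch, not the actual script; it assumes the backup-*.tar.gz naming above, keeps the seven newest archives, and relies on the date-based filenames containing no spaces, so parsing ls is safe here):
# list archives newest first, skip the first 7, delete the rest
# (sudo is needed while the files are still owned by root)
ls -1t ./backups/backup-*.tar.gz | tail -n +8 | xargs -r sudo rm --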
PS: I've only been using the Atom editor until now; I just installed VS Code, so yes, anything useful you have would be appreciated.
Thanks!!
The issue with the rights on the backups folder is related to the contents of the volumes folder. Docker creates some of them with root privileges, so the tar command needs to be run with sudo, which in turn creates the tar.gz as root.
"Fortunately" Raspbian has been configured not to ask for a password for sudo... that alone gives me the creeps, but it does allow you to execute scripts like this.
I think the reason you had issues with the logfile is that you had somehow created it as root. I've modified the script slightly to remove the additional sudos by doing a chown on backup*.
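Something along these lines (a sketch of the idea only; the exact user, source folder, and paths in the real script may differ):
# tar still needs sudo because some Docker volumes are root-owned
sudo tar -czf "./backups/$backupfile" ./volumes
# hand the archives back to the normal user so the remaining
# steps (logging, cleanup) no longer need sudo
sudo chown "$USER": ./backups/backup*
echo "$backupfile" >> "$logfile"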
Great work, now just to get the cloud backups trimmed
Hi,
Hopefully this is of some use; if you have a better solution, I would be interested.
I have done the following:
To do: Dropbox currently gets a copy of all files; look at how to remove older files if possible.
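One possible approach for the remote side, if a tool such as rclone were set up with a Dropbox remote (an assumption; the script in this thread doesn't use rclone), is to delete remote files past a certain age:
# "dropbox:" and the backups path are placeholder names for the remote;
# this removes remote files older than 30 days
rclone delete dropbox:backups --min-age 30d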