tachtler / dovecot-backup

Dovecot backup shell script that saves the emails of every mailbox to its own tar.gz file.
GNU General Public License v3.0

backup file expiration #1

Closed: detrout closed this issue 7 years ago

detrout commented 7 years ago

Hello,

If I'm reading:

(ls $users-$FILE_DELETE -t|head -n $DAYS_DELETE;ls $users-$FILE_DELETE )|sort|uniq -u|xargs rm

correctly, I think that if you run the backup script at anything other than once per day, it will delete the wrong number of backup files. Unless you mean "DAYS_DELETE" to really mean the number of files to keep, which could also be a reasonable policy.
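To spell out how I read it, here is a standalone sketch with made-up values ($users, $FILE_DELETE and the mailbox name are placeholders, not the script's real settings):

    # Sketch of the current retention logic: it keeps the $DAYS_DELETE newest
    # matching files and removes everything else, no matter how old the
    # surviving files actually are.
    DAYS_DELETE=5                      # effectively "number of files to keep"
    users="alice@example.com"          # hypothetical mailbox name
    FILE_DELETE="*.tar.gz"             # hypothetical file pattern

    # First listing: the $DAYS_DELETE newest files (ls -t sorts by mtime).
    # Second listing: all files. Files present in both listings appear twice,
    # so "uniq -u" drops them and only the older files reach rm.
    (ls $users-$FILE_DELETE -t | head -n $DAYS_DELETE; ls $users-$FILE_DELETE) \
        | sort | uniq -u | xargs rm

So running the script twice a day halves the effective retention window, because the limit counts files, not days.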

Perhaps something similar to find <dir> -name "$users-$FILE_DELETE" -a -ctime +${DAYS_DELETE} might be a better test?
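For concreteness, something along these lines is what I had in mind (the backup directory, names, and values are placeholders again, not taken from the script):

    # Sketch: delete backups whose status-change time is more than
    # $DAYS_DELETE days old, independent of how often the script runs.
    DIR_BACKUP="/var/backup/dovecot"   # hypothetical backup directory
    users="alice@example.com"          # placeholder mailbox name
    FILE_DELETE="*.tar.gz"             # placeholder pattern
    DAYS_DELETE=5

    find "$DIR_BACKUP" -maxdepth 1 -type f -name "${users}-${FILE_DELETE}" \
        -ctime +"$DAYS_DELETE" -exec rm -f {} +

(The -a is optional, since find ANDs its tests by default.)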

tachtler commented 7 years ago

Hi,

yes, you're right. The name of the variable is not the best choice. It should be BACKUPFILE_DELETE. I will change that the next time I edit the code.

And I will check your solution with the find command as well.

Thank you! Klaus.

detrout commented 7 years ago

You're welcome.

And thank you for making a script that was remarkably easy to read.

tachtler commented 7 years ago

Hi,

using -ctime inside a find command is a good idea, BUT when you restore data from an external file system, or move files from one backup system to another (outside the script), you always have to remember that the ctime has changed and would need to be corrected manually (the ctime of a file is updated whenever any of its metadata changes):

manpage of find:

-ctime n
       File's status was last changed n*24 hours ago.  See the comments
       for -atime to understand how rounding affects the interpretation
       of file status change times.

So given that the ctime can change and all the files can suddenly carry a newer ctime, the age test no longer reflects when the backups were really taken, and this could be dangerous: the script might end up deleting all the files.

So I think it is better NOT to use the find command in combination with -ctime.
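A quick illustration of the problem from the shell (the file name is just an example, not from the script):

    # The copy gets a brand-new ctime, even with "cp -p", because ctime
    # cannot be preserved; only mtime and atime can.
    stat -c 'mtime: %y   ctime: %z' alice@example.com-20170320.tar.gz
    cp -p alice@example.com-20170320.tar.gz /mnt/restore/
    stat -c 'mtime: %y   ctime: %z' /mnt/restore/alice@example.com-20170320.tar.gz

Right after a restore or a move, every backup file reports a ctime of "just now", so an age test based on -ctime no longer tells you when the backup was really taken.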

BUT I will change the name of the variable, so it is easier to understand what the delete step for older files really does!

Thank you, and please tell me what you think about the ctime problem I described! Klaus.

tachtler commented 7 years ago

Hi,

if there are no further comments, I will close this issue now.

Thank you for your contribution! Klaus.

detrout commented 7 years ago

I agree with your comment about the file times, as those do get modified when copying a file around. I was then wondering whether there was a shell-script way to parse the timestamp, but I didn't manage to come up with a solution.
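One rough idea I toyed with, assuming the backup files carry a date stamp in their names (the naming pattern below is purely hypothetical and may not match the script), would be to compare that embedded date instead of any file time:

    # Sketch: expire by the date encoded in the file name, which survives
    # copies, moves and restores unchanged.
    DAYS_DELETE=5
    cutoff=$(date -d "-${DAYS_DELETE} days" +%Y%m%d)
    for f in alice@example.com-*.tar.gz; do
        stamp=${f##*-}        # e.g. 20170320.tar.gz
        stamp=${stamp%%.*}    # e.g. 20170320
        # numeric compare against the cutoff; names without a date are skipped
        [ "$stamp" -lt "$cutoff" ] 2>/dev/null && rm -f -- "$f"
    done

It only works if the date really is part of the file name, of course.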