blacklabelops / volumerize

Docker Volume Backups Multiple Backends
https://hub.docker.com/r/blacklabelops/volumerize/
MIT License
558 stars 77 forks

Dealing with duplicity cache lock (.lock file) #8

Closed: darron1217 closed this issue 6 years ago

darron1217 commented 7 years ago

@blacklabelops While I was testing volumerize with limited storage, the duplicity backup process was interrupted because the storage ran out.

This left a .lock file in the /volumerize-cache directory, which causes the error below:

Another instance is already running with this archive directory
If you are sure that this is the  only instance running you may delete
the following lockfile and run the command again :
/volumerize-cache/3fe07cc0f71075f95f411fb55ec60120/lockfile.lock

How about adding a cache-removing script that runs before the backup starts?
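A minimal sketch of what such a pre-backup cleanup might look like. The function name, the `CACHE_DIR` variable, and its default path are illustrative assumptions, not part of volumerize at this point in the thread:

```shell
#!/bin/sh
# Hypothetical cleanup: remove duplicity lock files left behind by an
# interrupted run, so the next backup is not blocked. Only the
# lockfile.lock files are deleted; the cached metadata stays intact.
clean_cache_locks() {
  # Default path assumed from the error message above.
  cache_dir="${1:-/volumerize-cache}"
  find "$cache_dir" -name 'lockfile.lock' -type f -delete 2>/dev/null
}

clean_cache_locks "${CACHE_DIR:-/volumerize-cache}"
```

Running this unconditionally before every backup would be unsafe if another duplicity instance is genuinely still active, which is the concern raised below.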

blacklabelops commented 7 years ago

I am not really sure how to solve this.

The cache is really important for me. I am using the Backblaze backend for backups, which means that if I delete the cache, every backup has to rebuild it.

That means additional costs, because the data is transferred to and from my cloud storage.

Maybe we need to make this configurable?

darron1217 commented 7 years ago

@blacklabelops I meant removing lockfile.lock instead of the whole cache (I actually didn't know what the cache folder was for... haha)

Yep, the cache is important, but (I guess) an orphaned lockfile is not needed for the next backup :)

blacklabelops commented 7 years ago

I'm still trying to figure out how to decide whether a lockfile is really orphaned.
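One possible heuristic is to treat a lockfile as orphaned only if no running process currently holds a lock on it. A sketch using the `flock(1)` utility, with the caveat that duplicity's own locking may use fcntl-style locks, so this is an approximation and not volumerize's actual logic:

```shell
#!/bin/sh
# Hypothetical orphan check: try to acquire the lock non-blocking.
# If we succeed, no live process is holding it, so it is safe to treat
# the file as stale. Paths and names are illustrative assumptions.
lock_is_orphaned() {
  lockfile="$1"
  [ -f "$lockfile" ] || return 1   # no file, nothing to clean up
  # flock -n exits non-zero if another process holds the lock.
  flock -n "$lockfile" true
}

# Example: remove only the locks that no live process holds.
for lock in /volumerize-cache/*/lockfile.lock; do
  [ -e "$lock" ] || continue
  lock_is_orphaned "$lock" && rm -f "$lock"
done
```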

blacklabelops commented 6 years ago

Added a new cleanup script: cleanCacheLocks.
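Volumerize's helper scripts are typically invoked inside the running container; assuming the container is named `volumerize`, the new script would be run along these lines (container name is an assumption, not stated in this thread):

```shell
# Remove stale duplicity cache locks inside the running container,
# then trigger a backup manually.
docker exec volumerize cleanCacheLocks
docker exec volumerize backup
```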