wkoszek opened this issue 9 years ago
@wkoszek attic's codebase currently can't cope well with configurations that have little CPU/memory AND a big data set to back up.
The problem is that attic keeps the chunks (and files) index in memory, and these indexes grow with the number of chunks (and files). As attic uses small chunks (about 64 KiB on average), a lot of data means a lot of chunks.
Adding swap space might work around the memory issue, but it will make things even slower, so it isn't a good solution. Adding memory (RAM) would really help.
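As a rough back-of-the-envelope sketch of why this happens (the ~100 bytes of RAM per index entry is my assumption for illustration, not a figure measured from attic's hash-table implementation), the index memory for a data set like the ~876 GB home share described in this issue scales with the chunk count:

```python
# Rough estimate of attic's in-memory chunk-index size.
# Assumptions (mine, not measured from attic): ~64 KiB average chunk size,
# and ~100 bytes of RAM per index entry (32-byte chunk id plus refcount,
# sizes, and hash-table overhead).

AVG_CHUNK_SIZE = 64 * 1024   # bytes; attic's statistical average chunk size
BYTES_PER_ENTRY = 100        # assumed RAM cost per chunk-index entry

def index_memory(data_bytes, avg_chunk=AVG_CHUNK_SIZE, per_entry=BYTES_PER_ENTRY):
    """Return (chunk_count, estimated index RAM in bytes)."""
    chunks = data_bytes // avg_chunk
    return chunks, chunks * per_entry

chunks, ram = index_memory(876 * 10**9)  # the ~876 GB home share
print(f"{chunks:,} chunks, ~{ram / 2**30:.1f} GiB of index RAM")
```

With these assumptions the 876 GB data set yields over 13 million chunks and on the order of 1.2 GiB of chunk index, which by itself already exceeds the 1 GB of RAM on a small NAS.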
In my repo, I implemented changes to make the chunker configurable so that it creates fewer chunks. You can also use a different compression algorithm, which might help with speed as well.
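To illustrate why a configurable chunker helps (the numbers are purely illustrative; in practice the chunker's target size is controlled by its rolling-hash parameters, not a single byte value), raising the average chunk size shrinks the chunk count, and hence the index memory, proportionally:

```python
# How the average chunk size affects the chunk count (and hence index RAM).
# Illustrative only: larger average chunks mean proportionally fewer chunks,
# at the cost of coarser deduplication granularity.

DATA = 876 * 10**9  # bytes in the data set from this issue

for avg_chunk in (64 * 1024, 256 * 1024, 1024 * 1024):
    chunks = DATA // avg_chunk
    print(f"avg chunk {avg_chunk // 1024:>5} KiB -> {chunks:>12,} chunks")
```

Going from 64 KiB to 1 MiB average chunks cuts the chunk count (and the index memory with it) by roughly 16x, which is the kind of trade-off a low-memory NAS needs.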
I run `attic` on a Synology DS214play. It has 1 GB of RAM. The build came from the https://attic-backup.org/downloads/releases/0.16/Attic-0.16-linux-i686.tar.gz archive. The missing library `libacl.so.1` came from the `ubuntu/trusty32` Vagrant image. My unit has 2x1 TB disks working in a mirror. My `/volume1/homes/wkoszek` has 876 GB of data. I have a 1 TB WD disk connected via USB (`/volumeUSB2/usbshare`). I started from an empty USB disk.

Attic seems very slow. It took 140 hours to get:

of data archived from my NAS to the USB disk, and I don't think `attic` succeeded. After ~6 days Attic got an error:
I'd be interested in hearing whether people use Attic on archives of ~1TB size.