After testing we ended up with:

```
# compression (the inner pv shows compressed throughput; -k and --recursive
# are no-ops when pigz reads from stdin, so they are dropped here)
tar --use-compress-program="pigz --best | pv" -cf "/storage/rsync-public/tfchain-mainnet-$(date '+%Y-%m-%d').tar.gz" *

# decompression (pv feeds the archive into tar so the progress bar has data to show)
pv tfchain-mainnet-best.tar.gz | tar -I pigz -xf - -C extract/
```
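For repeated use, the compression step could be wrapped in a small script. A minimal sketch, assuming the output path above and adding an integrity check (the `OUT_DIR` name and the `pigz -t` step are assumptions, not the actual implementation):

```bash
#!/usr/bin/env bash
# Hypothetical snapshot wrapper; OUT_DIR and the verification step are assumptions.
set -euo pipefail

OUT_DIR="/storage/rsync-public"
ARCHIVE="$OUT_DIR/tfchain-mainnet-$(date '+%Y-%m-%d').tar.gz"

# pigz uses all available cores by default; pv reports compressed
# throughput while tar writes the archive to $ARCHIVE
tar --use-compress-program="pigz --best | pv" -cf "$ARCHIVE" *

# test the gzip stream before publishing the snapshot
pigz -t "$ARCHIVE"
```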
Testing a stack deploy from the snapshot to make sure no issues pop up. Will then implement it on all nets and apply.
Big difference in compression time, though the archives are about 30% bigger. In total, both creating and extracting save a lot of time, so let's go ahead with it.
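For anyone re-checking the time/size tradeoff on their own data, a quick comparison looks something like this (directory and archive names are placeholders, not the original benchmark):

```
time tar -czf snap-gzip.tar.gz data/                   # single-threaded gzip
time tar -I "pigz --best" -cf snap-pigz.tar.gz data/   # parallel pigz
ls -lh snap-gzip.tar.gz snap-pigz.tar.gz               # compare archive sizes
```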
Implemented on all nets.
```
# compress a dir (-k is a no-op when pigz reads from tar's stdin, so it is dropped)
tar --use-compress-program=pigz -cf dir1.tar.gz dir1

# uncompress (yields dir1.tar)
pigz -d dir1.tar.gz
# or
unpigz dir1.tar.gz
# or decompress and extract in one step
tar -I pigz -xf /mnt/sd/current/backup/bigbackup_web.tar.gz -C /tmp
```
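Since pigz writes standard gzip output, the archives stay readable with the ordinary tools as well, e.g.:

```
pigz -t dir1.tar.gz     # test integrity without extracting
tar -tzf dir1.tar.gz    # list contents with plain tar/gzip
```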
-> This will need adjusting in all backend stack install scripts AND the 'sync processor from 0' scripts before we start compressing the snapshots with pigz.
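A minimal sketch of the kind of change those scripts would need (`$SNAPSHOT` and `$TARGET_DIR` are assumed names, not the scripts' actual variables): prefer pigz when it is installed, and fall back to plain gzip, which can read pigz output:

```bash
# Hypothetical extract helper for the install / "sync processor from 0" scripts
if command -v pigz >/dev/null 2>&1; then
    tar -I pigz -xf "$SNAPSHOT" -C "$TARGET_DIR"   # parallel decompression
else
    tar -xzf "$SNAPSHOT" -C "$TARGET_DIR"          # pigz output is gzip-compatible
fi
```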