mrworf / iceshelf

A simple tool to allow storage of signed, encrypted, incremental backups using Amazon's Glacier storage
GNU General Public License v2.0

Use AWS client instead of glacier-cmd #9

Closed · mrworf closed this issue 7 years ago

mrworf commented 7 years ago

glacier-cmd is too poorly maintained and too limiting. Iceshelf should use the AWS CLI instead, which is up to date and lets us support progress reports and better resume handling if something fails. It's also a standard package in Ubuntu, which makes it easier to install.

Should solve the current issues seen with 32 GB archives.
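
Something along these lines is what I have in mind (just a sketch; the vault name, archive path, and helper are placeholders, not iceshelf's actual code):

```python
import json
import subprocess

def upload_archive(vault, archive_path, description=""):
    """Upload a single archive to Glacier via the aws CLI (sketch only)."""
    # 'aws glacier upload-archive' prints a JSON document with the archiveId
    # and checksum on success; account-id '-' means "use the account of the
    # configured credentials".
    result = subprocess.run(
        ["aws", "glacier", "upload-archive",
         "--account-id", "-",
         "--vault-name", vault,
         "--archive-description", description,
         "--body", archive_path],
        check=True, capture_output=True, text=True)
    return json.loads(result.stdout)

# Hypothetical usage:
# info = upload_archive("my-backups", "backup-2017-01.tar.gpg", "iceshelf backup")
# print(info["archiveId"])
```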

mrworf commented 7 years ago

This will require the treehash module to be installed instead (why reinvent the wheel?). And while there is an SDK for AWS, I'd prefer to use a tool that limits the chances of sending the wrong data to Glacier (i.e. invalid parameters, etc.).
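
For reference, the SHA-256 tree hash Glacier expects boils down to this (just the algorithm, not necessarily the treehash module's API):

```python
import hashlib

CHUNK = 1024 * 1024  # Glacier tree hashes are built from 1 MiB chunks

def tree_hash(path):
    """Compute a Glacier-style SHA-256 tree hash of a file (sketch)."""
    # Hash every 1 MiB chunk of the file.
    hashes = []
    with open(path, "rb") as f:
        while True:
            chunk = f.read(CHUNK)
            if not chunk:
                break
            hashes.append(hashlib.sha256(chunk).digest())
    if not hashes:
        hashes = [hashlib.sha256(b"").digest()]
    # Pairwise combine until a single root hash remains.
    while len(hashes) > 1:
        paired = []
        for i in range(0, len(hashes), 2):
            if i + 1 < len(hashes):
                paired.append(hashlib.sha256(hashes[i] + hashes[i + 1]).digest())
            else:
                paired.append(hashes[i])
        hashes = paired
    return hashes[0].hex()
```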

mrworf commented 7 years ago

Another benefit would be the potential for multithreaded uploads, speeding up the process. This, however, will only be implemented once the new solution is in place and works.
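
Roughly what I'm picturing on top of the CLI's multipart commands (the part splitting, vault name, and worker count here are placeholders):

```python
import subprocess
from concurrent.futures import ThreadPoolExecutor

def upload_part(vault, upload_id, part_file, offset, size):
    """Upload one part of a multipart Glacier upload via the aws CLI."""
    byte_range = "bytes %d-%d/*" % (offset, offset + size - 1)
    subprocess.run(
        ["aws", "glacier", "upload-multipart-part",
         "--account-id", "-",
         "--vault-name", vault,
         "--upload-id", upload_id,
         "--range", byte_range,
         "--body", part_file],
        check=True)

def upload_parts_parallel(vault, upload_id, parts, workers=4):
    """parts: list of (part_file, offset, size) tuples (hypothetical layout)."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        futures = [pool.submit(upload_part, vault, upload_id, f, off, size)
                   for f, off, size in parts]
        for fut in futures:
            fut.result()  # re-raise any upload failure
```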

wenliang commented 7 years ago

Is it using awscli now?

Thanks @mrworf

mrworf commented 7 years ago

@wenliang Yep, couldn't wait :-D

Doing testing right now and it seems much more solid. But I'm still seeing problems uploading to Ireland from the US, so I've added a retry delay and better logging to figure out why. It seems to be an issue with the distance between the US and the Ireland region when transferring large files (10 GB+). Once such an upload succeeds I'll close this issue.
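
The retry delay is basically along these lines (the attempt count, delay, and logger name are just placeholders):

```python
import logging
import subprocess
import time

log = logging.getLogger("iceshelf")  # hypothetical logger name

def run_with_retry(cmd, attempts=5, delay=30):
    """Run an aws CLI command, retrying with a fixed delay on failure (sketch)."""
    for attempt in range(1, attempts + 1):
        try:
            return subprocess.run(cmd, check=True, capture_output=True, text=True)
        except subprocess.CalledProcessError as e:
            log.warning("attempt %d/%d failed (exit %d): %s",
                        attempt, attempts, e.returncode, e.stderr.strip())
            if attempt == attempts:
                raise
            time.sleep(delay)
```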

mrworf commented 7 years ago

You can install awscli either via `apt install awscli` (worked on Ubuntu 16.04 LTS) or via `pip install awscli`.
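
Either way, checking from Python that the CLI is actually available is trivial (a sketch, not iceshelf's actual check):

```python
import shutil
import subprocess

def have_awscli():
    """Return True if the aws CLI is on the PATH and responds to --version."""
    if shutil.which("aws") is None:
        return False
    return subprocess.run(["aws", "--version"],
                          capture_output=True).returncode == 0
```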

mrworf commented 7 years ago

Another reason to use awscli is that it's actively maintained, unlike glacier-cmd, which seems to have stalled. Also, it turned out treehash was never needed.

wenliang commented 7 years ago

Thanks.

I was having a problem with glacier-cmd failing with error 408. I figured out that adding `--partsize 1` solved my problem. I was going to send a pull request today, but was surprised to see you are using awscli already 😄

I am uploading with the latest version now. Seems pretty steady.

Thanks a lot! :+1:

mrworf commented 7 years ago

The latest commit resolves this issue. It now handles Glacier's 40,000 GB archive limit and will scale the part size from 1 MB up to whatever is necessary to stay within requirements. Also added support for multithreading during file uploads, allowing much higher speeds on high-latency connections.
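
The part-size scaling boils down to picking the smallest allowed part size that keeps the archive within Glacier's 10,000-part limit, roughly:

```python
MiB = 1024 * 1024
MAX_PARTS = 10000           # Glacier multipart uploads allow at most 10,000 parts
MAX_PART_SIZE = 4096 * MiB  # part sizes go from 1 MiB up to 4 GiB, powers of two

def choose_part_size(archive_size):
    """Smallest valid part size that fits the archive in 10,000 parts (sketch)."""
    part_size = MiB
    while archive_size > part_size * MAX_PARTS:
        part_size *= 2
    if part_size > MAX_PART_SIZE:
        raise ValueError("archive exceeds Glacier's ~40,000 GB limit")
    return part_size
```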