Bakthat is an MIT-licensed backup framework written in Python. It is both a command-line tool and a Python module that helps you manage backups on Amazon S3/Glacier and OpenStack Swift. It automatically compresses, encrypts (symmetric encryption), and uploads your files.
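For context, this is roughly how bakthat gets invoked, both from the shell and as a module. The paths below are placeholders, and credentials and destination are assumed to have been set up beforehand with bakthat configure:

    $ bakthat configure
    $ bakthat backup /data/my-archive

    # or, as a Python module
    import bakthat
    bakthat.backup("/data/my-archive")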
I'm getting the following error message upon uploading a ~450GB file to Glacier. Previously I've successfully backed up a 150GB file, so I assume this has to do with file size. Any ideas on how to solve this?
File "/usr/local/bin/bakthat", line 9, in
load_entry_point('bakthat==0.6.0', 'console_scripts', 'bakthat')()
File "/usr/local/lib/python2.7/dist-packages/bakthat/init.py", line 649, in main
app.run()
File "/usr/local/lib/python2.7/dist-packages/aaargh/app.py", line 176, in run
return func(kwargs)
File "/usr/local/lib/python2.7/dist-packages/bakthat/init**.py", line 362, in backup
storage_backend.upload(stored_filename, outname, s3_reduced_redundancy=s3_reduced_redundancy)
File "/usr/local/lib/python2.7/dist-packages/bakthat/backends.py", line 186, in upload
archive_id = self.vault.concurrent_create_archive_from_file(filename, keyname)
File "/usr/local/lib/python2.7/dist-packages/boto/glacier/vault.py", line 266, in concurrent_create_archive_from_file
archive_id = uploader.upload(filename, description)
File "/usr/local/lib/python2.7/dist-packages/boto/glacier/concurrent.py", line 152, in upload
raise e
boto.glacier.exceptions.UploadArchiveError: An error occurred while uploading an archive: Expected 204, got (408, code=RequestTimeoutException, message=Request timed out.)
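For reference, the failing call goes through boto's Glacier ConcurrentUploader, which splits the archive into parts and uploads them from several threads. Below is a minimal sketch of the equivalent direct boto call; the region, credentials, vault name, file path, part size, and thread count are placeholders chosen for illustration, not the values bakthat actually passes:

    import boto.glacier
    from boto.glacier.concurrent import ConcurrentUploader

    # connect_to_region returns a Layer2 object; its .layer1 attribute is the
    # low-level API object that ConcurrentUploader expects.
    glacier = boto.glacier.connect_to_region(
        "us-east-1",
        aws_access_key_id="AKIA...",       # placeholder
        aws_secret_access_key="...")       # placeholder

    # part_size and num_threads are the knobs ConcurrentUploader exposes.
    # Glacier allows at most 10,000 parts per archive and part sizes must be
    # a power-of-two number of MB, so a ~450 GB archive needs parts of at
    # least 64 MB.
    uploader = ConcurrentUploader(glacier.layer1, "my-vault",
                                  part_size=64 * 1024 * 1024,
                                  num_threads=4)

    archive_id = uploader.upload("/backups/my-archive.tgz.enc", "450GB backup")
    print(archive_id)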