Bakthat is an MIT-licensed backup framework written in Python. It's both a command-line tool and a Python module that helps you manage backups on Amazon S3/Glacier and OpenStack Swift. It automatically compresses, encrypts (symmetric encryption), and uploads your files.
I created a 'test' Glacier vault, then ran the following command:
bakthat configure
I was able to set everything up correctly with the config file (AWS creds, vault, region_name, etc.).
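For reference, bakthat configure wrote its settings to a YAML file in my home directory. Mine looked roughly like this (values redacted; the key names are from memory of the bakthat docs, so treat them as illustrative rather than authoritative):

```yaml
default:
  access_key: AKIA...          # AWS access key id (redacted)
  secret_key: "..."            # AWS secret key (redacted)
  region_name: us-east-1
  glacier_vault: test          # the vault I created
  default_destination: glacier
```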
Then I ran the following command (zbxapi.py is just an ordinary Python file):
$ bakthat backup zbxapi.py
Backing up zbxapi.py
Password (blank to disable encryption):
Compressing...
Uploading...
Traceback (most recent call last):
  File "/usr/bin/bakthat", line 9, in <module>
    load_entry_point('bakthat==0.6.0', 'console_scripts', 'bakthat')()
  File "/usr/lib/python2.7/site-packages/bakthat/__init__.py", line 649, in main
    app.run()
  File "/usr/lib/python2.7/site-packages/aaargh/app.py", line 176, in run
    return func(**kwargs)
  File "/usr/lib/python2.7/site-packages/bakthat/__init__.py", line 362, in backup
    storage_backend.upload(stored_filename, outname, s3_reduced_redundancy=s3_reduced_redundancy)
  File "/usr/lib/python2.7/site-packages/bakthat/backends.py", line 186, in upload
    archive_id = self.vault.concurrent_create_archive_from_file(filename, keyname)
  File "/usr/lib/python2.7/site-packages/boto/glacier/vault.py", line 266, in concurrent_create_archive_from_file
    archive_id = uploader.upload(filename, description)
  File "/usr/lib/python2.7/site-packages/boto/glacier/concurrent.py", line 152, in upload
    raise e
boto.glacier.exceptions.UploadArchiveError: An error occurred while uploading an archive: 'ascii' codec can't decode byte 0x8b in position 1: ordinal not in range(128)
Any help on this? I went into the Python debugger and poked around, but I couldn't tell where the file was getting placed into the Queue object. By the time I reached that point in the debugger, the object had already been turned into a UnicodeDecodeError.
I have a pretty default setup.
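One clue I noticed: the byte 0x8b at position 1 matches the second byte of the gzip magic number (\x1f\x8b), i.e. the start of the compressed archive itself. So my working theory (an assumption on my part, not verified against the bakthat/boto source) is that somewhere in the upload path the raw gzip bytes get decoded with Python 2's default ASCII codec. That reproduces the exact error text:

```python
# Sketch of the suspected failure mode (my assumption, not bakthat's actual code):
# decoding raw gzip bytes with the ASCII codec fails on the second byte of the
# gzip magic number, 0x8b, at position 1.
data = b'\x1f\x8b\x08\x00'  # first bytes of a gzip stream
try:
    data.decode('ascii')
except UnicodeDecodeError as e:
    print(e)
```

The printed message matches the one in the traceback above byte for byte, which is why I suspect the compressed payload, not the filename, is what's being mis-decoded.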