leeroybrun / glacier-vault-remove

Remove all archives stored inside an Amazon Glacier vault, even if you have a huge number of them.

Error deleting vault with around 7m files #10

Closed rhamnett closed 7 years ago

rhamnett commented 9 years ago

Hi, I'm getting an error when removing a vault with lots of files (circa 7m)... any help much appreciated.

```
Traceback (most recent call last):
  File "./removeVault.py", line 92, in <module>
    inventory = json.loads(job.get_output().read())
  File "/usr/local/lib/python2.7/dist-packages/boto-2.38.0-py2.7.egg/boto/glacier/job.py", line 89, in get_output
    byte_range)
  File "/usr/local/lib/python2.7/dist-packages/boto-2.38.0-py2.7.egg/boto/glacier/layer1.py", line 740, in get_job_output
    response_headers=response_headers)
  File "/usr/local/lib/python2.7/dist-packages/boto-2.38.0-py2.7.egg/boto/glacier/layer1.py", line 116, in make_request
    return GlacierResponse(response, response_headers)
  File "/usr/local/lib/python2.7/dist-packages/boto-2.38.0-py2.7.egg/boto/glacier/response.py", line 41, in __init__
    body = json.loads(http_response.read().decode('utf-8'))
  File "/usr/local/lib/python2.7/dist-packages/boto-2.38.0-py2.7.egg/boto/connection.py", line 410, in read
    self._cached_response = http_client.HTTPResponse.read(self)
  File "/usr/lib/python2.7/httplib.py", line 551, in read
    s = self._safe_read(self.length)
  File "/usr/lib/python2.7/httplib.py", line 660, in _safe_read
    raise IncompleteRead(''.join(s), amt)
httplib.IncompleteRead: IncompleteRead(155886763 bytes read, 2697175103 more expected)
```

rhamnett commented 9 years ago

I'm wondering whether it requires a chunk/multi-part download for such a big inventory?

Richard

leeroybrun commented 9 years ago

Hello,

Sorry for the delay, did you find a solution?

rhamnett commented 9 years ago

No, still have the same issue unfortunately. I definitely think it's down to the massive inventory and the lack of multipart download.

luminal-neal commented 8 years ago

I had a similar issue when running this script on a t2.micro. I believe I simply ran out of memory. Re-running on my laptop (w/ 16GB mem) had no issue.

revjtanton commented 8 years ago

I'm having a similar error when deleting 2m archives, but I only get a Segmentation fault error and the process stops before it really begins. I'm running this on a VM though so I'll try increasing the memory and see what happens.

leeroybrun commented 7 years ago

I'm closing this. It seems to be related to low memory. Don't hesitate to let me know if anyone with enough memory hits the same issue, and I will reopen it.

AmazingDreams commented 7 years ago

I am trying to delete around 150,000 files, and you are leaking an insane amount of memory during the deletion process. I watched an 8 GB VM fill up like crazy, roughly 200 MB added every second.
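One way to keep the deletion loop at constant memory is to consume archive IDs lazily and discard each response immediately, instead of building up per-archive state. A minimal sketch, assuming hypothetical `iter_archive_ids` (a generator parsing the inventory lazily) and `delete_archive` (a stand-in for boto's `layer1.delete_archive(vault_name, archive_id)`) callables, neither of which is in the script as-is:

```python
def delete_all(iter_archive_ids, delete_archive):
    """Delete archives one at a time with bounded memory.

    Nothing is accumulated per archive: each ID is pulled from the
    iterator, deleted, and the response is dropped on the floor.
    """
    deleted = 0
    for archive_id in iter_archive_ids:
        delete_archive(archive_id)  # response discarded, not retained
        deleted += 1
    return deleted
```

The key point is that `iter_archive_ids` should be a generator, not a fully materialised list of 150,000+ entries.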