Azure / azure-storage-python

Microsoft Azure Storage Library for Python
https://azure-storage.readthedocs.io
MIT License

Retry policy did not allow for a retry #562

Closed mallik3006 closed 5 years ago

mallik3006 commented 5 years ago

Which service(blob, file, queue) does this issue concern?

Blob

Which version of the SDK was used? Please provide the output of pip freeze.

azure-storage-blob==1.5.0 azure-storage-common==1.4.0 azure-storage-file==1.4.0 azure-storage-queue==1.4.0

What problem was encountered?

I'm getting a "retry policy did not allow for a retry" error:

Client-Request-ID=3bfd0839-4ebb-11e9-a920-080027deca13 Retry policy did not allow for a retry: Server-Timestamp=Mon, 25 Mar 2019 05:02:46 GMT, Server-Request-ID=36a57b91-901e-0119-0dc7-e2a272000000, HTTP status code=404, Exception=The specified blob does not exist. ErrorCode: BlobNotFound<?xml version="1.0" encoding="utf-8"?><Error><Code>BlobNotFound</Code><Message>The specified blob does not exist.RequestId:36a57b91-901e-0119-0dc7-e2a272000000Time:2019-03-25T05:02:46.6337446Z</Message></Error>.
blob_file = os.path.basename(destination)
get_gen = block_blob_service.list_blobs(container)
for blob in get_gen:
    if os.path.basename(blob.name) == blob_file:
        block_blob_service.get_blob_to_path(container, blob, download_path)

It appears the second request (the one that fetches the blob) is failing, as if only one operation per request (here, list_blobs) were being allowed. File size isn't an issue and the file exists. I can download the same file if I use only one method at a time, i.e. get_blob_to_path on its own without listing the blobs first. I need to search for the blobs and then download them, so I need both operations working together.
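A likely culprit (an assumption, not confirmed in the thread): in azure-storage-blob 1.x, `get_blob_to_path(container_name, blob_name, file_path)` expects the blob *name* as a string, but the loop above passes the whole `Blob` object, so the request is built against a name the service cannot find and returns 404 BlobNotFound. A minimal sketch of the corrected loop, with a fake service class standing in for `BlockBlobService` so it runs without an Azure account:

```python
import os

class Blob:
    """Hypothetical stand-in for azure.storage.blob.models.Blob."""
    def __init__(self, name):
        self.name = name

class FakeBlockBlobService:
    """Hypothetical stand-in for BlockBlobService, keyed by blob name."""
    def __init__(self, blobs):
        self._blobs = {b.name: b for b in blobs}

    def list_blobs(self, container):
        return list(self._blobs.values())

    def get_blob_to_path(self, container, blob_name, file_path):
        # The real SDK builds the request URL from this string; passing
        # a Blob object instead of its name yields a lookup that fails
        # with BlobNotFound (HTTP 404).
        if blob_name not in self._blobs:
            raise KeyError("BlobNotFound: %s" % blob_name)
        return file_path

service = FakeBlockBlobService([Blob("dir/report.csv"), Blob("dir/other.txt")])

# Illustrative names only; the real code uses the reporter's paths.
container = "mycontainer"
destination = "/tmp/report.csv"
download_path = "/tmp/report.csv"

blob_file = os.path.basename(destination)
for blob in service.list_blobs(container):
    if os.path.basename(blob.name) == blob_file:
        # Pass blob.name (a str), not the Blob object itself.
        service.get_blob_to_path(container, blob.name, download_path)
```

The same one-character-class fix (`blob` to `blob.name`) applies verbatim to the reporter's snippet against the real `BlockBlobService`.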

I tried adding linear and exponential retry policies, but that did not solve the issue. Could you please provide a resolution?

Have you found a mitigation/solution?

No

Note: for table service, please post the issue here instead: https://github.com/Azure/azure-cosmosdb-python.

mallik3006 commented 5 years ago

I figured out the issue; closing this.