Azure / azure-storage-python

Microsoft Azure Storage Library for Python
https://azure-storage.readthedocs.io
MIT License

IOError: close() called during concurrent operation on the same file object. #486

Closed: nafabrar closed this issue 6 years ago

nafabrar commented 6 years ago

Which service(blob, file, queue) does this issue concern?

Blob

Which version of the SDK was used? Please provide the output of pip freeze.

0.36.0

What problem was encountered?

IOError: close() called during concurrent operation on the same file object.

Traceback (most recent call last):
  File "/home/amcpherson/production/gsc/tantalus/tantalus/backend/file_transfer_utils.py", line 420, in transfer_files
    f_transfer(file_instance, to_storage)
  File "/home/amcpherson/production/gsc/tantalus/tantalus/backend/file_transfer_utils.py", line 197, in upload_to_blob
    timeout=106064)
  File "/home/amcpherson/production/gsc/tantalus/venv/lib/python2.7/site-packages/azure/storage/blob/blockblobservice.py", line 411, in create_blob_from_path
    timeout=timeout)
IOError: close() called during concurrent operation on the same file object.
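For reference, the failing call is BlockBlobService.create_blob_from_path, which splits large blobs into blocks and uploads them on a small thread pool by default. The sketch below is illustrative only: the account, container, blob, and path names are placeholders, not the actual code in file_transfer_utils.py, and max_connections=1 is shown simply as one way to force a sequential upload while debugging.

```python
# Sketch only: names and values below are assumptions, not the actual
# tantalus code. create_blob_from_path uploads a local file as a block
# blob, using multiple threads when max_connections > 1.
from azure.storage.blob import BlockBlobService

blob_service = BlockBlobService(
    account_name='<storage-account>',   # placeholder
    account_key='<storage-key>',        # placeholder
)

blob_service.create_blob_from_path(
    container_name='<container>',       # placeholder
    blob_name='<blob-name>',            # placeholder
    file_path='/path/to/local/file',    # placeholder
    timeout=106064,                     # per-request timeout, as in the traceback
    max_connections=1,                  # sequential upload; the default of 2 uploads blocks in parallel
)
```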

Have you found a mitigation/solution?

No

Note: for table service, please post the issue here instead: https://github.com/Azure/azure-cosmosdb-python.

zezha-msft commented 6 years ago

Hi @nafabrar, thanks for reaching out and reporting this issue!

I noticed that you are using a very old version, 0.36.0, which is the monolithic SDK that is now deprecated. I would strongly suggest upgrading to the latest azure-storage-blob, version 1.3.1.
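For anyone following along, the switch is small. A sketch, assuming a pip-managed virtualenv and that only the blob service is needed:

```python
# Sketch of the migration, assuming a pip-managed virtualenv:
#   pip uninstall azure-storage             # old monolithic 0.36.0 package
#   pip install azure-storage-blob==1.3.1   # split blob-only package
#
# The import path and the BlockBlobService API stay the same:
from azure.storage.blob import BlockBlobService
```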

Also, how frequently do you see this error message? Did you operate on the file at the same time the upload was happening?

nafabrar commented 6 years ago

Hi @zezha-msft,

Thanks for the quick response! We see the error often, in roughly 1 out of 5 cases. Also, we did not operate on the file while the upload was happening.

zezha-msft commented 6 years ago

Hi @nafabrar, I apologize for the inconvenience.

Since the issue occurs so frequently, would you mind upgrading to the latest SDK and seeing whether it still persists?

nafabrar commented 6 years ago

Hi @zezha-msft,

Sure, we will update it and see whether the problem still exists. Thank you.

zezha-msft commented 6 years ago

Hi @nafabrar, any update?

nafabrar commented 6 years ago

Hi @zezha-msft, we are now getting this error:

Exception=ReadTimeout: HTTPSConnectionPool(host='singlecelldata.blob.core.windows.net', port=443): Read timed out. (read timeout=20).

Traceback (most recent call last):
  File "/home/amcpherson/production/gsc/tantalus/tantalus/backend/task_scripts/transfer_files.py", line 8, in
    run_task(args['primary_key'], FileTransfer, transfer_files)
  File "/home/amcpherson/production/gsc/tantalus/tantalus/backend/task_scripts/utils.py", line 37, in run_task
    func(task_model, temp_directory)
  File "/home/amcpherson/production/gsc/tantalus/tantalus/backend/file_transfer_utils.py", line 416, in transfer_files
    f_transfer(file_instance, to_storage)
  File "/home/amcpherson/production/gsc/tantalus/tantalus/backend/file_transfer_utils.py", line 196, in upload_to_blob
    timeout=106064)
  File "/home/amcpherson/production/gsc/tantalus/venv/lib/python2.7/site-packages/azure/storage/blob/blockblobservice.py", line 463, in create_blob_from_path
    timeout=timeout)
  File "/home/amcpherson/production/gsc/tantalus/venv/lib/python2.7/site-packages/azure/storage/blob/blockblobservice.py", line 629, in create_blob_from_stream
    timeout=timeout,
  File "/home/amcpherson/production/gsc/tantalus/venv/lib/python2.7/site-packages/azure/storage/blob/_upload_chunking.py", line 144, in _upload_blob_substream_blocks
    range_ids = [uploader.process_substream_block(result) for result in uploader.get_substream_blocks()]
  File "/home/amcpherson/production/gsc/tantalus/venv/lib/python2.7/site-packages/azure/storage/blob/_upload_chunking.py", line 248, in process_substream_block
    return self._upload_substream_block_with_progress(block_data[0], block_data[1])
  File "/home/amcpherson/production/gsc/tantalus/venv/lib/python2.7/site-packages/azure/storage/blob/_upload_chunking.py", line 251, in _upload_substream_block_with_progress
    range_id = self._upload_substream_block(block_id, block_stream)
  File "/home/amcpherson/production/gsc/tantalus/venv/lib/python2.7/site-packages/azure/storage/blob/_upload_chunking.py", line 283, in _upload_substream_block
    timeout=self.timeout,
  File "/home/amcpherson/production/gsc/tantalus/venv/lib/python2.7/site-packages/azure/storage/blob/blockblobservice.py", line 1013, in _put_block
    self._perform_request(request)
  File "/home/amcpherson/production/gsc/tantalus/venv/lib/python2.7/site-packages/azure/storage/common/storageclient.py", line 381, in _perform_request
    raise ex
azure.common.AzureException: ReadTimeout: HTTPSConnectionPool(host='singlecelldata.blob.core.windows.net', port=443): Read timed out. (read timeout=20)

zezha-msft commented 6 years ago

Hi @nafabrar, you are hitting the socket timeout; please increase its value (the socket_timeout setting on the service client) to suit your environment. We set a non-zero socket timeout so that your requests do not hang forever. The default socket timeout is defined in azure-storage-common; its value is different for Python 3.5+, and the explanation is in the comments there.
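For example, a minimal sketch of raising it when constructing the client; the credentials are placeholders and the 120-second value is an arbitrary example, not a recommendation:

```python
# Sketch only: credentials are placeholders and 120 s is an arbitrary example;
# pick a value suited to your network speed and block size.
from azure.storage.blob import BlockBlobService

blob_service = BlockBlobService(
    account_name='<storage-account>',  # placeholder
    account_key='<storage-key>',       # placeholder
    socket_timeout=120,                # per-socket-read timeout in seconds; default is 20 on Python 2.7
)
```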

It sounds like you are not seeing the IOError anymore, so I'll go ahead and close this issue. But if you encounter it again, please let me know and I can look into it further with your help.