Azure-Samples / azure-batch-samples

Azure Batch and HPC Code Samples

upload hangs at initial connection #134

Closed · clackner2007 closed this issue 8 years ago

clackner2007 commented 8 years ago

I'm just testing uploads with blobxfer. I have no issue downloading data, but I can't get an upload to work. I'm sure the SAS has write permission on the container, and I've also tried using the storage account key directly, to no effect. Here's what I'm running (foo.txt isn't empty).

$ blobxfer account container foo.txt --no-overwrite --numworkers 1 --saskey $BLOB_SAS_KEY --upload --no-createcontainer --no-computefilemd5
=======================================
 azure blobxfer parameters [v0.10.0]
=======================================
             platform: Darwin-15.4.0-x86_64-i386-64bit
   python interpreter: CPython 2.7.11
     package versions: az.common=1.1.1 az.sml=0.20.2 az.stor=0.30.0 crypt=1.3.1 req=2.9.1
      subscription id: None
      management cert: None
   transfer direction: local->Azure
       local resource: foo.txt
      include pattern: None
      remote resource: None
   max num of workers: 1
              timeout: None
      storage account: account
              use SAS: True
  upload as page blob: False
  auto vhd->page blob: False
            container: container
   blob container URI: https://account.blob.core.windows.net/container
     compute file MD5: False
    skip on MD5 match: True
   chunk size (bytes): 4194304
     create container: False
  keep mismatched MD5: False
     recursive if dir: True
component strip on up: 1
        remote delete: False
           collate to: disabled
      local overwrite: False
      encryption mode: disabled
         RSA key file: disabled
         RSA key type: disabled
=======================================

script start time: 2016-05-05 18:18:45

If I kill the job, it looks to be hung up making the SSL connection:

script start time: 2016-05-05 18:18:45
^CTraceback (most recent call last):
  File "/Users/clackner/anaconda/envs/poc/bin/blobxfer", line 11, in <module>
    sys.exit(main())
  File "/Users/clackner/anaconda/envs/poc/lib/python2.7/site-packages/blobxfer.py", line 2118, in main
    blobskipdict = get_blob_listing(blob_service[0], args)
  File "/Users/clackner/anaconda/envs/poc/lib/python2.7/site-packages/blobxfer.py", line 1560, in get_blob_listing
    container_name=args.container, marker=marker, include=incl)
  File "/Users/clackner/anaconda/envs/poc/lib/python2.7/site-packages/blobxfer.py", line 1373, in azure_request
    return req(*args, **kwargs)
  File "/Users/clackner/anaconda/envs/poc/lib/python2.7/site-packages/blobxfer.py", line 538, in list_blobs
    requests.get, url=url, params=reqparams, timeout=self.timeout)
  File "/Users/clackner/anaconda/envs/poc/lib/python2.7/site-packages/blobxfer.py", line 1373, in azure_request
    return req(*args, **kwargs)
  File "/Users/clackner/anaconda/envs/poc/lib/python2.7/site-packages/requests/api.py", line 67, in get
    return request('get', url, params=params, **kwargs)
  File "/Users/clackner/anaconda/envs/poc/lib/python2.7/site-packages/requests/api.py", line 53, in request
    return session.request(method=method, url=url, **kwargs)
  File "/Users/clackner/anaconda/envs/poc/lib/python2.7/site-packages/requests/sessions.py", line 468, in request
    resp = self.send(prep, **send_kwargs)
  File "/Users/clackner/anaconda/envs/poc/lib/python2.7/site-packages/requests/sessions.py", line 576, in send
    r = adapter.send(request, **kwargs)
  File "/Users/clackner/anaconda/envs/poc/lib/python2.7/site-packages/requests/adapters.py", line 376, in send
    timeout=timeout
  File "/Users/clackner/anaconda/envs/poc/lib/python2.7/site-packages/requests/packages/urllib3/connectionpool.py", line 559, in urlopen
    body=body, headers=headers)
  File "/Users/clackner/anaconda/envs/poc/lib/python2.7/site-packages/requests/packages/urllib3/connectionpool.py", line 376, in _make_request
    httplib_response = conn.getresponse(buffering=True)
  File "/Users/clackner/anaconda/envs/poc/lib/python2.7/httplib.py", line 1136, in getresponse
    response.begin()
  File "/Users/clackner/anaconda/envs/poc/lib/python2.7/httplib.py", line 453, in begin
    version, status, reason = self._read_status()
  File "/Users/clackner/anaconda/envs/poc/lib/python2.7/httplib.py", line 409, in _read_status
    line = self.fp.readline(_MAXLINE + 1)
  File "/Users/clackner/anaconda/envs/poc/lib/python2.7/socket.py", line 480, in readline
    data = self._sock.recv(self._rbufsize)
  File "/Users/clackner/anaconda/envs/poc/lib/python2.7/site-packages/requests/packages/urllib3/contrib/pyopenssl.py", line 180, in recv
    data = self.connection.recv(*args, **kwargs)
  File "/Users/clackner/anaconda/envs/poc/lib/python2.7/site-packages/OpenSSL/SSL.py", line 1319, in recv
    result = _lib.SSL_read(self._ssl, buf, bufsiz)
KeyboardInterrupt
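
Since the traceback ends in blobxfer's list_blobs call and the timeout above is None, the underlying List Blobs request can be exercised directly with a bounded timeout, roughly along these lines (assuming $BLOB_SAS_KEY holds the bare SAS query string without a leading '?'; note the SAS also needs list permission for this operation):

$ curl -v --max-time 30 "https://account.blob.core.windows.net/container?restype=container&comp=list&$BLOB_SAS_KEY"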

alfpark commented 8 years ago

Hello and sorry for the delay. Are you still facing this issue?

clackner2007 commented 8 years ago

Yes, I'm still facing this issue.

alfpark commented 8 years ago

Can you try updating your system OpenSSL as well as pyopenssl? Additionally, please upgrade to 0.10.1 (with the updated dependencies - please ensure requests 2.10.0 and cryptography 1.4 are being used).
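
Something along these lines should pull in the matching package versions (the exact pins are just a suggestion; OpenSSL itself has to be updated through your OS package manager or conda):

$ pip install --upgrade blobxfer==0.10.1 requests==2.10.0 cryptography==1.4 pyopenssl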

Also, how many blobs exist in the container?

clackner2007 commented 8 years ago

So, using the --no-skiponmatch option fixes the problem, in that I can then upload data to the blob container. If I don't use it, I still hit the hang even with the latest OpenSSL and v0.10.1 of blobxfer. Here's a copy of the environment:

name: test
dependencies:
- cffi=1.6.0=py27_0
- enum34=1.1.6=py27_0
- idna=2.1=py27_0
- ipaddress=1.0.16=py27_0
- libffi=3.2.1=0
- mkl=11.3.3=0
- numpy=1.11.0=py27_1
- openssl=1.0.2h=1
- pip=8.1.2=py27_0
- pyasn1=0.1.9=py27_0
- pycparser=2.14=py27_1
- python=2.7.11=0
- readline=6.2=2
- setuptools=22.0.5=py27_0
- six=1.10.0=py27_0
- sqlite=3.13.0=0
- tk=8.5.18=0
- wheel=0.29.0=py27_0
- zlib=1.2.8=3
- pip:
  - azure-common==1.1.4
  - azure-nspkg==1.0.0
  - azure-servicemanagement-legacy==0.20.3
  - azure-storage==0.32.0
  - blobxfer==0.10.1
  - cryptography==1.4
  - futures==3.0.5
  - python-dateutil==2.5.3
  - requests==2.10.0

There are ~20 blobs in the container.

alfpark commented 8 years ago

Can you try using the raw shared account key and not a SAS? I realize you tried that before; I just wanted to make sure with the new packages/version. If the error persists, then I'm not sure why the list blobs REST call is failing. I cannot repro this locally on my machine.

clackner2007 commented 8 years ago

Doesn't work with the raw key either. It does work with the --no-skiponmatch option, so I'll use that for now.
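
For reference, the working invocation is essentially the original one with that flag added:

$ blobxfer account container foo.txt --no-overwrite --numworkers 1 --saskey $BLOB_SAS_KEY --upload --no-createcontainer --no-computefilemd5 --no-skiponmatch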

alfpark commented 8 years ago

Thanks for the info. Will close the issue for now as it appears to be an underlying package or system issue.