Closed: simongoldstone closed this issue 3 years ago
It would be nice to have a set of options relating to timeouts/connection errors/retries, etc. in the same way Robocopy does.
Hey @simongoldstone,
Thanks for reaching out, and apologies for the late response.
An AzCopy job first scans the source and tries to create batches of files that need to be transferred. The size of one batch is 10,000 files, which explains why AzCopy appears to transfer only 10K files at a time. Once that batch finishes, AzCopy picks another batch for transfer. Rest assured that we don't skip any file unless you specifically tell us to.
If a particular file transfer fails, we retry it twenty times before we mark it as failed. If the issue persists, please send your AzCopy logs to my email and I'll investigate it for you.
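To locate the logs and retry only the failed transfers, something like the following should work (a minimal sketch, assuming the default log location of ~/.azcopy; <job-id> is a placeholder):

# Logs and job plan files live under ~/.azcopy by default
ls ~/.azcopy

# List recent jobs and note the ID of the partially failed one
azcopy jobs list

# Show only the transfers that failed within that job
azcopy jobs show <job-id> --with-status=Failed

# Retry the failed/skipped transfers; for a download, supply the source SAS again
azcopy jobs resume <job-id> --source-sas="<SAS token>"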
Version: azcopy 10.7.0
OS: Ubuntu Desktop 20.04.1 LTS
Command:
azcopy cp "https://[mystorageaccount].blob.core.windows.net/?sv=[SASKEYHERE]" "/path/to/dest" --recursive --overwrite ifSourceNewer
Error: INFO: failed to list blobs in container [mycontainer]: cannot list files due to reason read tcp [localip]->[remoteip]:443: read: connection reset by peer
Background: I'm trying to download the whole storage account to local disk for on-premises backup purposes. The command is only partially successful: some blob containers are downloaded without issue, but others only download the first 10,000 items before AzCopy moves on to other containers.
In my storage account I have approximately 20 containers, holding roughly 2-3 TB of data in total. Some containers download perfectly (in excess of 10,000 blobs) whereas others will not go past the 10,000 limit. For example, azcopy downloaded 22,233 items from one container and only 10,000 items from the very next one.
Another problem is that it skips some containers entirely. When the command finishes, there are 3 containers that were not even attempted, i.e. there is no reference to them in my destination.
In an attempt to mitigate the issue, I have also tried running the azcopy cp command at container level (rather than against the whole storage account), and it still refuses to download more than 10,000 items. I also receive the same error at container level as I do at the full storage-account level.
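For reference, a sketch of the kind of container-level retry I could run with reduced concurrency and debug logging to capture more detail around the connection reset (the AZCOPY_CONCURRENCY_VALUE of 16 is only an illustrative guess; the container name is a placeholder):

# Lower the number of concurrent connections AzCopy opens (illustrative value)
export AZCOPY_CONCURRENCY_VALUE=16

# Container-level download with verbose logging
azcopy cp "https://[mystorageaccount].blob.core.windows.net/[mycontainer]?sv=[SASKEYHERE]" "/path/to/dest" --recursive --overwrite ifSourceNewer --log-level=DEBUG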