Closed: Cherishty closed this issue 6 years ago
Hi @Cherishty, thanks for reaching out!
I understand that you need to upload/download entire directories, but I do not believe the SDK is the right place for that functionality. The SDK is meant to be a thin wrapper over the REST API, so the user should be responsible for enumerating the directory and making decisions about the behavior (e.g., whether or not to follow symbolic links). Listing directories should be straightforward.
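As a sketch of the client-side enumeration described above, the standard library already covers it (the helper name here is illustrative, not part of the SDK; `followlinks` is where the symbolic-link decision gets made):

```python
import os

def enumerate_files(root, follow_symlinks=False):
    """Yield (absolute_path, path_relative_to_root) for every file under root.

    follow_symlinks controls whether os.walk descends into symlinked
    directories -- the policy decision the SDK leaves to the caller.
    """
    for dirpath, _dirnames, filenames in os.walk(root, followlinks=follow_symlinks):
        for name in filenames:
            full = os.path.join(dirpath, name)
            yield full, os.path.relpath(full, root)
```

The relative path is what you would reuse as the remote path when mirroring the hierarchy to the service.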
If you would like to use a tool to upload/download directories, please consider AzCopy, or the experimental AzCopy V2. Here is a guide.
Thanks for your clarification, Ze. I agree with you that an SDK based on the REST API may not be able to upload an entire directory directly, but shouldn't an SDK be responsible for providing an interface for batch behavior? In this case, that means batch upload and download.
AzCopy is cool, I think, but it does not provide an interface to load a file into memory (like get_file_to_text()), or vice versa.
Anyway, since I have your clear response on this issue, I will implement the batch action on my side. Thanks!
Hi @Cherishty, I agree that a batch SDK could be useful to customers. We currently have this item on our backlog but unfortunately have no timeline yet. Thanks!
Which service (blob, file, queue) does this issue concern?
Azure File
Which version of the SDK was used? Please provide the output of
pip freeze
azure-storage-common==1.3.0
azure-storage-file==1.3.1
azure-storage-nspkg==3.0.0
What problem was encountered?
Can the Python API for Azure File support uploading a local directory to a remote share? Downloading a folder is the same requirement. If not, why? I was using the web_hdfs Python API, which supports this behavior.
Have you found a mitigation/solution?
If this function is not provided by the API, we may recursively invoke create_file_from_path() over the folder's hierarchy, but I believe that is too awkward.
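For anyone landing here, the recursive workaround can be sketched as follows against azure-storage-file 1.x. The helper names and the plan/upload split are my own; `create_directory` and `create_file_from_path` are the actual SDK calls, and `remote_root` is assumed to be a non-empty directory name that may or may not exist on the share yet:

```python
import os

def plan_uploads(local_root, remote_root):
    """Map every file under local_root to the (remote_directory, file_name)
    it should land in, mirroring the local folder hierarchy."""
    plan = []
    for dirpath, _dirnames, filenames in os.walk(local_root):
        rel = os.path.relpath(dirpath, local_root)
        remote_dir = remote_root if rel == "." else "/".join([remote_root] + rel.split(os.sep))
        for name in filenames:
            plan.append((os.path.join(dirpath, name), remote_dir, name))
    return plan

def upload_directory(file_service, share_name, local_root, remote_root):
    """Recursively upload local_root to an Azure file share. Parent
    directories are created first, one level at a time, since the File
    service does not create intermediate directories implicitly."""
    created = set()
    for local_path, remote_dir, name in plan_uploads(local_root, remote_root):
        parts = remote_dir.split("/")
        for i in range(1, len(parts) + 1):
            prefix = "/".join(parts[:i])
            if prefix not in created:
                # fail_on_exist defaults to False, so re-creating is harmless.
                file_service.create_directory(share_name, prefix)
                created.add(prefix)
        file_service.create_file_from_path(share_name, remote_dir, name, local_path)
```

You would call it with `file_service = FileService(account_name=..., account_key=...)` from `azure.storage.file`. Splitting the planning step out also lets you test the hierarchy mapping without touching the service.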