When publishing a product to a cloud resource, we often want to make the product as easy to download as possible. Using a cloud-provider-specific URL scheme for the download forces the client to install drivers specific to that cloud provider. That can be fine, but it is an additional burden. Cloud providers generally do provide raw HTTPS endpoints which can be used by cloud-oblivious tools such as wget.
It would be nice to have an option... something like:
retrieveRawHTTPs True|False.
If set to false, then the retrieval URLs are "azure://... "
If set to true, then the retrieval URLs are "https://... "
so when the option is set, the consumer of the messages can download using wget, or any browser (anything that can consume HTTPS) rather than needing an Azure (or S3) specific client.
One thing that I'm vague on... in all other contexts we split the URL into baseUrl and the rest. I'm wondering whether a getHttpsUrl() entry should return a tuple of baseUrl and the rest also, or the more natural complete URL.
If it returns the complete one, then the calling logic will need to split it up to obtain the baseUrl ...
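To illustrate the trade-off: if getHttpsUrl() returned a complete URL, the caller's split would be a one-liner. A minimal sketch, assuming the standard library's urllib and a hypothetical helper name (split_base_url is not an existing sr3 function):

```python
from urllib.parse import urlparse

def split_base_url(complete_url: str) -> tuple[str, str]:
    """Hypothetical helper: split a complete URL into (baseUrl, relPath),
    which is what the calling logic would have to do if getHttpsUrl()
    returned a single complete URL."""
    u = urlparse(complete_url)
    return (f"{u.scheme}://{u.netloc}", u.path)

# example with an Azure blob-style endpoint:
base, rel = split_base_url("https://myaccount.blob.core.windows.net/container/dir/file.grib2")
# base: "https://myaccount.blob.core.windows.net"
# rel:  "/container/dir/file.grib2"
```

Since the split is this cheap, returning the complete URL and letting callers split it may be the simpler interface; returning the tuple just saves each caller a line.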
I have thought of two ways of implementing this so far:
have the ls() command of the transfer set an "httpsUrl" for each SFTPFileAttributes entry a cloud Transfer implementation creates. The caller digesting the ls() output then uses the retrieveRawHttps setting to choose which fields are used to build the message.
not modify ls() at all; instead add getHttpsUrl() which, given an azure: URL, can return the equivalent https one (as a tuple, or perhaps just a relative path...). In this mode, processing listings will involve calling that new entry point for every file returned by ls()...
either method will work. Other suggestions welcome...
@gcglinton has agreed to work on implementing the transformation to https. @petersilva can then use that to modify the internal sr3 logic.
This would be great/useful for S3 and Azure.