Closed: j-heinze closed this issue 1 year ago.
Hi @j-heinze ,
I am keen to understand the necessity of this requirement and the use case it originates from. It should be fairly easy to create your own containers with a naming convention that defines their purpose, say
landingzone-bc-production
landingzone-bc-sandbox1
landingzone-dataverse
landingzone-crm
Read the following to learn more about naming conventions for containers: https://learn.microsoft.com/en-us/rest/api/storageservices/naming-and-referencing-containers--blobs--and-metadata#container-names
Best regards, Dutta.
Hi @DuttaSoumya, I understand the thought behind this. It's just that we already have an existing data lake, which is structured like the following:
This would allow us to integrate the export into our existing structure. In general, why would it not make sense to allow fully specifying a path for the export? Of course, it would require an additional parameter for the pipelines and the dataflow/notebook.
Best regards, Jan
Thanks @j-heinze. Indeed, what you are looking for is perhaps to change the base URL for all calls to the lake (you would still need to create a separate container for each BC environment).
You may therefore make the change to the GetBaseUrl function.
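As a rough illustration of that change, here is a minimal sketch (in Python, not the tool's actual AL code; the function name `get_base_url`, its parameters, and the URL layout after the container segment are assumptions for illustration) of how a base URL could be extended with an optional directory path:

```python
def get_base_url(storage_account: str, container: str, directory_path: str = "") -> str:
    """Compose an ADLS Gen2 base URL, optionally nesting exports
    under a directory inside the container (hypothetical sketch)."""
    url = f"https://{storage_account}.dfs.core.windows.net/{container}"
    if directory_path:
        # Append the optional directory, normalizing stray slashes.
        url += "/" + directory_path.strip("/")
    return url

# Without a directory path the URL points at the container root,
# matching the current behavior described in the issue.
print(get_base_url("mylake", "landingzone-bc-production"))
print(get_base_url("mylake", "landingzone-bc-production", "bc/production"))
```

All subsequent read/write calls would then build their paths on top of this base URL, so the extra directory level is applied consistently.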
As this is a fringe scenario, and because of the workaround described above, the implementation could be done in your own repo.
Assuming there are no further comments, as there have been none in the last 9 days, I will close this issue.
Hello,
Is it possible to add an optional directory path to further specify the export location for the data?
Currently only the container can be specified:
Thank you, Jan