Closed: acjdekorte closed this issue 2 months ago.
Hi @acjdekorte, I can see your point about processing them file by file. But what I do in the Fabric approach is merge everything into one dataframe and sort it. That way you don't have to sort and process the files individually.
Hi @Bertverbeek4PS, thank you for the suggestion. Indeed, I could also sort after importing.
We would like to process our delta files in time order in Azure Data Factory. Unfortunately, the 'Get Metadata' activity for Azure Data Lake can only retrieve an array of childItems with the properties name and type. To sort the files by datetime, we would have to fetch each file individually to read its metadata, which means a lot of extra processing. Given that the array is sorted by filename, putting the timestamp in the filename would be an easier solution and probably also a faster one, since every call to Azure Data Lake seems to take several seconds.

Would it be acceptable to add a setting to the BC connector that uses the datetime as the filename instead of a GUID? Or do you see a better solution to our problem?

The filename change could be made in Codeunit 82562 "ADLSE Communication", in CreateDatablob(). A rough sketch of the idea follows below.
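For illustration only, a minimal AL sketch of what such a setting could look like. The object, the procedure name GetDataBlobName, the UseDateTimeFileNames parameter, and the assumption that the current name is GUID-based are all hypothetical; the actual wiring into CreateDatablob() and the "ADLSE Setup" table would be part of the pull request.

```al
codeunit 50100 "ADLSE File Name Sketch"
{
    // Hypothetical object id and name; for illustration only.

    procedure GetDataBlobName(UseDateTimeFileNames: Boolean): Text
    begin
        if UseDateTimeFileNames then
            // Zero-padded timestamp so that lexicographic order equals
            // chronological order, e.g. 20240131T235959123
            exit(Format(CurrentDateTime, 0,
                '<Year4><Month,2><Day,2>T<Hours24,2><Minutes,2><Seconds,2><Thousands,3>'));
        // Assumed current behaviour: a GUID-based name without braces
        exit(DelChr(Format(CreateGuid()), '=', '{}'));
    end;
}
```

One caveat: unlike a GUID, a millisecond timestamp is not guaranteed to be unique if two blobs are created in the same instant, so a combined name such as timestamp plus GUID would keep the files sortable while preserving uniqueness.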
If this is acceptable, I will open a pull request for it.