Bertverbeek4PS / bc2adls

Exporting data from Dynamics 365 Business Central to Azure Data Lake Storage or a Microsoft Fabric lakehouse
MIT License

Invalid format of GUID string. The correct format of the GUID string is: CDEF7890-ABCD-0123-1234-567890ABCDEF #166

Closed: DeepAsmani closed this issue 1 month ago

DeepAsmani commented 3 months ago

Hi @Bertverbeek4PS, I got this error:

"Invalid format of GUID string. The correct format of the GUID string is: CDEF7890-ABCD-0123-1234-567890ABCDEF where 0-9, A-F symbolizes hexadecimal digits."ADLSE Communication"(CodeUnit 82562).GetBaseUrl line 15 - Azure Data Lake Storage Export by The bc2adls team\"ADLSE Communication"(CodeUnit 82562).CheckEntity line 26 - Azure Data Lake Storage Export by The bc2adls team\"ADLSE Execute"(CodeUnit 82561).TryExportTableData line 12 - Azure Data Lake Storage Export by The bc2adls team\"ADLSE Execute"(CodeUnit 82561).OnRun(Trigger) line 48 - Azure Data Lake Storage Export by The bc2adls team"

(screenshot of the error message attached)

Bertverbeek4PS commented 3 months ago

Hi, could you change the workspace and lakehouse values in the setup to their GUIDs? Then it should work. But I will also look into the code.
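
For reference, the setup fields expect a canonical GUID rather than the workspace or lakehouse display name. A minimal Python sketch of that format check, purely as an illustration (it is not the extension's AL code):

```python
import re

# Canonical GUID layout: 8-4-4-4-12 hexadecimal digits, e.g.
# CDEF7890-ABCD-0123-1234-567890ABCDEF
GUID_PATTERN = re.compile(
    r"^[0-9A-Fa-f]{8}-[0-9A-Fa-f]{4}-[0-9A-Fa-f]{4}-"
    r"[0-9A-Fa-f]{4}-[0-9A-Fa-f]{12}$"
)

def is_guid(value: str) -> bool:
    """Return True if value looks like a canonical GUID (braces allowed)."""
    return bool(GUID_PATTERN.match(value.strip().strip("{}")))

print(is_guid("My Fabric Workspace"))                    # False, triggers the error
print(is_guid("CDEF7890-ABCD-0123-1234-567890ABCDEF"))   # True
```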

DeepAsmani commented 3 months ago

Thanks, Bert, for the speedy reply.

Could you please guide me on where I can find the GUID of the workspace and lakehouse? Also, could you explain how I can send all companies' data to Fabric?

Bertverbeek4PS commented 3 months ago

If you open the lakehouse, you can see both GUIDs in the URL (screenshot attached).

The first is the workspace and the second is the lakehouse.
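
If the URL is hard to read, the GUIDs can also be pulled out programmatically. A small sketch, assuming the usual https://app.fabric.microsoft.com/groups/<workspace-id>/lakehouses/<lakehouse-id> layout (the example URL below is made up):

```python
import re

# Hypothetical example URL; the real one is whatever your browser shows
# while the lakehouse is open in Fabric.
url = ("https://app.fabric.microsoft.com/groups/"
       "11111111-2222-3333-4444-555555555555/lakehouses/"
       "aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee")

# Collect every GUID-shaped token in order of appearance.
guids = re.findall(
    r"[0-9A-Fa-f]{8}(?:-[0-9A-Fa-f]{4}){3}-[0-9A-Fa-f]{12}",
    url,
)

workspace_id, lakehouse_id = guids[0], guids[1]  # first = workspace, second = lakehouse
print("Workspace GUID:", workspace_id)
print("Lakehouse GUID:", lakehouse_id)
```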

When you want to export from multiple companies, you have to create a job queue entry in each company.

DeepAsmani commented 3 months ago

Thanks, Bert. I will check and let you know.

DeepAsmani commented 3 months ago

Hi Bert,

It works! Could you also tell me how to send multi-company data? I have 26 companies, and I want to consolidate all the GL entry data into a single dataset in Fabric.

Bertverbeek4PS commented 3 months ago

Ok, great. If you want to export from all companies, you need to create a job queue entry in each company that exports the data: https://github.com/Bertverbeek4PS/bc2adls/blob/main/.assets/FAQs.md#how-do-i-run-the-export-to-the-lake-in-a-recurring-schedule
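
On the Fabric side, a hedged PySpark sketch of the consolidation step for a notebook, assuming each entity ends up as one table holding all companies' rows plus a company identifier column; the table name GLEntry_17 and the column name Company are placeholders, so check the actual names in your lakehouse:

```python
# Fabric notebook (PySpark) sketch. "spark" is the session the notebook
# provides automatically; table and column names below are placeholders.
from pyspark.sql import functions as F

# One entity table that (by assumption) already holds the rows of all
# 26 companies, distinguished by a company identifier column.
gl_entries = spark.read.table("GLEntry_17")

# Quick sanity check: row count per company.
gl_entries.groupBy("Company").agg(F.count("*").alias("Rows")).show()

# Persist one combined dataset for downstream reporting; not filtering on
# the company column keeps all companies together in a single table.
gl_entries.write.mode("overwrite").saveAsTable("GLEntry_AllCompanies")
```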