Bertverbeek4PS / bc2adls

Exporting data from Dynamics 365 Business Central to Azure data lake storage or MS Fabric lakehouse
MIT License

OutOfMemoryException #178

Open Prenamix opened 1 month ago

Prenamix commented 1 month ago

Hi,

For some reason I get this error message:

```
Exception of type 'System.OutOfMemoryException' was thrown.
"ADLSE Http"(CodeUnit 82563).InvokeRestApi - Azure Data Lake Storage Export by The bc2adls team
"ADLSE Gen 2 Util"(CodeUnit 82568).GetBlobContentLength line 16 - Azure Data Lake Storage Export by The bc2adls team
"ADLSE Communication"(CodeUnit 82562).FlushPayload line 35 - Azure Data Lake Storage Export by The bc2adls team
"ADLSE Communication"(CodeUnit 82562).CollectAndSendRecord line 18 - Azure Data Lake Storage Export by The bc2adls team
"ADLSE Communication"(CodeUnit 82562).TryCollectAndSendRecord line 6 - Azure Data Lake Storage Export by The bc2adls team
"ADLSE Execute"(CodeUnit 82561).ExportTableUpdates line 44 - Azure Data Lake Storage Export by The bc2adls team
"ADLSE Execute"(CodeUnit 82561).TryExportTableData line 14 - Azure Data Lake Storage Export by The bc2adls team
"ADLSE Execute"(CodeUnit 82561).OnRun(Trigger) line 48 - Azure Data Lake Storage Export by The bc2adls team
"ADLSE Execute"(CodeUnit 82561).ExportTableUpdates line 48 - Azure Data Lake Storage Export by The bc2adls team
"ADLSE Execute"(CodeUnit 82561).TryExportTableData line 14 - Azure Data Lake Storage Export by The bc2adls team
"ADLSE Execute"(CodeUnit 82561).OnRun(Trigger) line 48 - Azure Data Lake Storage Export by The bc2adls team
```

Does anyone know why?

This happened while running an export of the Sales Line table with 26 fields selected, on fewer than 5 million rows.

Thanks.

Bertverbeek4PS commented 1 month ago

Hi @Prenamix, I have never had that issue. Can you check in telemetry what the length of the payload is? There is event ID ADLSE-013. For Fabric there is a maximum of 2 GB for a blob; you can maybe lower this.
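If the environment's telemetry flows to Azure Application Insights, the ADLSE-013 event can be inspected with a KQL query along these lines (a sketch only; the exact table and custom dimension names may differ per telemetry setup):

```
traces
| where timestamp > ago(7d)
| where customDimensions.eventId == "ADLSE-013"
| project timestamp, message, customDimensions
| order by timestamp desc
```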

You could also try setting the maximum payload size to a lower number.
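To illustrate the idea behind a size-capped payload (this is a minimal sketch in Python, not the extension's actual AL code; all names here are hypothetical), a buffer can accumulate rows and flush whenever it crosses a configured byte limit, so no single upload approaches the 2 GB blob cap:

```python
class PayloadBuffer:
    """Buffers rows and flushes them once the configured max size is reached."""

    def __init__(self, flush, max_bytes):
        self.flush = flush          # callback that ships one chunk to storage
        self.max_bytes = max_bytes  # flush threshold in bytes
        self.parts = []
        self.size = 0

    def collect(self, row: str):
        data = row + "\n"
        self.parts.append(data)
        self.size += len(data.encode("utf-8"))
        if self.size >= self.max_bytes:
            self.flush_payload()

    def flush_payload(self):
        # Ship whatever is buffered and reset the accumulator.
        if self.parts:
            self.flush("".join(self.parts))
            self.parts.clear()
            self.size = 0


# Tiny demo: a 8-byte threshold forces a flush after every two 4-byte rows.
chunks = []
buf = PayloadBuffer(chunks.append, max_bytes=8)
for row in ["a,b", "c,d", "e,f"]:
    buf.collect(row)
buf.flush_payload()  # flush the remainder at end of export
```

Lowering `max_bytes` trades more (smaller) upload requests for a lower peak memory footprint, which is the same trade-off as lowering the max payload size setting.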

Fawalken commented 3 weeks ago

I have a client with some very large GLEntry and ValueEntry tables (30M+ rows), and I had to handle this very carefully. On the initial sync I had to disable row version sorting in the extension interface, do a full export of those two tables in each company, run the notebook to convert to delta, and then enable row version sorting again once that was done. With row version sorting enabled, the out-of-memory exception would always occur.

Bertverbeek4PS commented 3 weeks ago

Thanks @Fawalken, turning it off is indeed a very good suggestion. @Prenamix, can you try that?