Bertverbeek4PS / bc2adls

Exporting data from Dynamics 365 Business Central to Azure data lake storage or MS Fabric lakehouse
MIT License

Cannot write more bytes to the buffer than the configured maximum buffer size: 2147483647 #11

Closed: Arthurvdv closed this issue 1 year ago

Arthurvdv commented 1 year ago

> Cannot write more bytes to the buffer than the configured maximum buffer size: 2147483647

(error screenshots omitted)

With the Storage type set to Microsoft Fabric, there seems to be a limit on the size of an exported file once it reaches 2 GB (2,147,483,647 bytes is 2^31 - 1, the maximum value of a signed 32-bit integer, which suggests a 32-bit buffer cap rather than a configurable setting).

Does anybody know if this is a hard limit, or a setting somewhere in the tenant/workspace/lakehouse/...?

In case this is a hard limit, I could mitigate the problem by creating a new file when the export exceeds 2 GB. Or do we need to stop executing at 2 GB and let a new run pick up where the previous run left off? I'm also not sure whether creating multiple delta export files could impact the sequence/sorting when the CopyBusinessCentral notebook processes them.
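For illustration, here is a minimal sketch of the rollover idea in Python (bc2adls itself is written in AL, and all names below are hypothetical): close the current file and start a new part once the next write would push it past a threshold safely below the 2 GB buffer cap.

```python
# Minimal sketch of the rollover idea (hypothetical names, not bc2adls code):
# start a new part file once the next write would cross the threshold.
MAX_BYTES = 2_000_000_000  # stay safely below the 2,147,483,647-byte buffer cap

class RollingWriter:
    def __init__(self, base_path: str):
        self.base_path = base_path
        self.part = 0
        self.written = 0
        self.handle = open(f"{base_path}.part{self.part}.csv", "wb")

    def write(self, chunk: bytes) -> None:
        # Roll over before the current file would exceed the limit.
        if self.written + len(chunk) > MAX_BYTES:
            self.handle.close()
            self.part += 1
            self.written = 0
            self.handle = open(f"{self.base_path}.part{self.part}.csv", "wb")
        self.handle.write(chunk)
        self.written += len(chunk)

    def close(self) -> None:
        self.handle.close()
```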

Bertverbeek4PS commented 1 year ago

@Arthurvdv I have searched, but it is a limit on the client side. So creating a new file after 2 GB is a good option. The notebook will pick up all the files and put them into a dataframe, so it doesn't matter how many files are in the folder.
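As an illustration of why the file count doesn't matter, a Spark read over a folder pulls every file into a single dataframe. The path and column name below are assumptions for the sketch, not the actual CopyBusinessCentral code.

```python
# Sketch of a folder-wide read (assumed paths/columns, not the actual
# CopyBusinessCentral notebook): Spark reads every file in the folder
# into one dataframe, so the number of delta files is irrelevant.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

df = (spark.read
      .option("header", "true")
      .csv("Files/deltas/MyTable/"))  # hypothetical folder of export files

# If processing order matters, sort on an explicit column (e.g. a modified
# timestamp) rather than relying on file order.
df = df.orderBy("SystemModifiedAt")  # hypothetical column name
```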

Arthurvdv commented 1 year ago

@Bertverbeek4PS, good to know that the notebook can handle multiple files.

I'll create a PR in the next couple of days that creates a new file before the limit is reached.