Bertverbeek4PS / bc2adls

Exporting data from Dynamics 365 Business Central to Azure data lake storage or MS Fabric lakehouse
MIT License

On-premises storage instead of ADLS? #72

Closed: cegekaJG closed this issue 8 months ago

cegekaJG commented 8 months ago

Is it possible to aggregate the changes to the BC database on a local server instead? I'm looking for a way to provide the changes (inserts, modifications, and deletions) since a given timestamp to a third-party service that interfaces with BC. The service currently works with Microsoft Finance & Operations using delta links, which unfortunately are deprecated for BC. That is why I need an intermediary service that keeps track of changes and only provides the records that have been altered since the previous timestamp. It's important to note that these changes are requested every second and never reach back more than an hour, so the service creating and providing these deltas needs to be fast, and the data can be cleaned up very frequently.

Is running the storage on a local server feasible, and are the writes fast enough to keep up with the frequent polling from the third-party service?

Bertverbeek4PS commented 8 months ago

If it is an MS Cloud environment, I doubt you can get a request every second; that is really fast. In that case you could export the updates to Azure File storage, but I think that will take about a minute. And note that if a user edits a customer record across multiple fields, you get a notification for every field update.

If it is on-premises, you can attach another database at the SQL level (like the read-only replica in BC online): https://learn.microsoft.com/en-us/dynamics365/business-central/dev-itpro/administration/database-read-scale-out-configuration
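
For reference, in BC online a page can route its own reads to that read-only replica through the DataAccessIntent property. A minimal sketch, assuming read scale-out is enabled; the object number, publisher, and group here are hypothetical:

```al
// Hypothetical API page whose queries are served by the read-only replica.
page 50100 "Customer Sync API"
{
    PageType = API;
    APIPublisher = 'contoso';      // hypothetical publisher
    APIGroup = 'sync';             // hypothetical group
    APIVersion = 'v1.0';
    EntityName = 'customerSync';
    EntitySetName = 'customerSyncs';
    SourceTable = Customer;
    DelayedInsert = true;
    Editable = false;
    ODataKeyFields = SystemId;
    DataAccessIntent = ReadOnly;   // send reads to the replica, not the primary

    layout
    {
        area(Content)
        {
            repeater(Records)
            {
                field(systemId; Rec.SystemId) { }
                field(number; Rec."No.") { }
                field(systemModifiedAt; Rec.SystemModifiedAt) { }
            }
        }
    }
}
```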

Curious why they want it that fast.

cegekaJG commented 8 months ago

The requirements are a bit different - it isn't so much about storing data over a long timeframe as about a short list of changes to tables with a lot of traffic, such as Item Journal Entry or Sales Line. Like I said, the timeframes between requests aren't long, and after longer periods such as breaks or a new shift the service can simply request the entire table through the BC API. Since it's even possible to filter on the Modified At field, it really only comes down to quickly finding out which records, if any, have been deleted since the last request.

Bertverbeek4PS commented 8 months ago

Well, with webhooks you have a delay of 30 seconds. You can use Business Events and subscribe to the OnDelete trigger, or maybe to the global triggers, where you specify which tables should fire. Then in the business event you provide the SystemId, the table number, and the action. That way, when a record is deleted, the other party gets a notification.
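
A minimal sketch of that combination - the global delete trigger feeding an external business event - assuming a BC version that supports external business events; the object numbers, the tracked tables, and the event category extension are all hypothetical:

```al
// Hypothetical category for the event; the EventCategory enum is extensible.
enumextension 50100 "Sync Event Category" extends EventCategory
{
    value(50100; Sync) { Caption = 'Sync'; }
}

codeunit 50101 "Deletion Event Publisher"
{
    // Ask the platform to fire the global delete trigger for the tables we track.
    [EventSubscriber(ObjectType::Codeunit, Codeunit::GlobalTriggerManagement, 'OnAfterGetDatabaseTableTriggerSetup', '', false, false)]
    local procedure EnableDeleteTrigger(TableId: Integer; var OnDatabaseInsert: Boolean; var OnDatabaseModify: Boolean; var OnDatabaseDelete: Boolean; var OnDatabaseRename: Boolean)
    begin
        if TableId in [Database::"Sales Line", Database::"Item Ledger Entry"] then
            OnDatabaseDelete := true;
    end;

    // Fires on every physical delete of a tracked table.
    [EventSubscriber(ObjectType::Codeunit, Codeunit::GlobalTriggerManagement, 'OnAfterOnDatabaseDelete', '', false, false)]
    local procedure PublishDeletion(RecRef: RecordRef)
    var
        DeletedSystemId: Guid;
    begin
        DeletedSystemId := RecRef.Field(RecRef.SystemIdNo).Value;
        RecordDeleted(DeletedSystemId, RecRef.Number, 'deleted');
    end;

    // The external business event payload: SystemId, table number, and the action.
    [ExternalBusinessEvent('recordDeleted', 'Record deleted', 'A tracked record was deleted', EventCategory::Sync, '1.0')]
    local procedure RecordDeleted(recordSystemId: Guid; tableNumber: Integer; action: Text[30])
    begin
    end;
}
```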

cegekaJG commented 8 months ago

The service in question requires client-side polling only, and the vendor has already declined to adopt webhooks. There has to be an intermediary service that logs the changes, which is why I originally posted this issue.

Bertverbeek4PS commented 8 months ago

Then I would do the same as bc2adls does: register the deletions in a separate table. The third party can then poll all the changes from that table through an API (see the sketch at the end of this comment).

Example code: https://github.com/Bertverbeek4PS/bc2adls/blob/main/businessCentral/app/src/Execution.Codeunit.al

Because if you put the changes on local storage, that will also cost you some time. You could instead create a business event and store the payload somewhere, but that will cost time as well.
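
A minimal sketch of that pattern: a log table plus a subscriber to the global delete trigger (which must also be enabled per table via OnAfterGetDatabaseTableTriggerSetup, as in the earlier sketch). The object numbers and names here are hypothetical; the real bc2adls implementation is in the codeunit linked above.

```al
// Hypothetical deletion log, written from the global delete trigger.
table 50102 "Deleted Record Log"
{
    DataClassification = CustomerContent;

    fields
    {
        field(1; "Entry No."; Integer) { AutoIncrement = true; }
        field(2; "Table No."; Integer) { }
        field(3; "Record System Id"; Guid) { }
        field(4; "Deleted At"; DateTime) { }
    }
    keys
    {
        key(PK; "Entry No.") { Clustered = true; }
    }
}

codeunit 50103 "Deletion Logger"
{
    [EventSubscriber(ObjectType::Codeunit, Codeunit::GlobalTriggerManagement, 'OnAfterOnDatabaseDelete', '', false, false)]
    local procedure LogDeletion(RecRef: RecordRef)
    var
        DeletedRecordLog: Record "Deleted Record Log";
    begin
        if RecRef.IsTemporary() then
            exit;
        DeletedRecordLog.Init();
        DeletedRecordLog."Table No." := RecRef.Number;
        DeletedRecordLog."Record System Id" := RecRef.Field(RecRef.SystemIdNo).Value;
        DeletedRecordLog."Deleted At" := CurrentDateTime();
        DeletedRecordLog.Insert(true);
    end;
}
```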

cegekaJG commented 8 months ago

I talked to the client again and I was a bit mistaken - the polling doesn't happen every second; there's a minimum cooldown of 30 seconds between requests. Your example code definitely is a solution, but I'm wondering if I can avoid a decrease in performance by offloading the writes to the deletion log to a background session or an external service. If the refresh rate is below 2 Hz, it might be possible to use webhooks after all.

Bertverbeek4PS commented 8 months ago

Then I would suggest Business Events; that way you don't write anything to the database. And in my opinion, business events are better than webhooks: business events react immediately.

cegekaJG commented 8 months ago

Business events require a Dataverse subscription, correct?

Bertverbeek4PS commented 8 months ago

Business Events don't require Dataverse. You can also consume them from non-Dataverse systems: https://learn.microsoft.com/en-us/dynamics365/business-central/dev-itpro/developer/business-events-overview#query-business-central-catalog-of-business-events-in-non-dataverse-systems

And in that case, when you subscribe to a Business Event, the subscription doesn't expire.

When you do it with the OnDelete trigger, I doubt it is doable with a background session - I've never done that part, but you can test it. For the table, I would make the primary key an Integer with AutoIncrement = true. The third party must then poll the API on that table every 30 seconds with a filter on the modified date or the row version (see the sketch below).
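
A minimal sketch of that polling surface over the hypothetical log table from the earlier sketch; object numbers, publisher, and group are again hypothetical:

```al
// Hypothetical API page exposing the deletion log for polling.
page 50104 "Deleted Record Log API"
{
    PageType = API;
    APIPublisher = 'contoso';
    APIGroup = 'sync';
    APIVersion = 'v1.0';
    EntityName = 'deletedRecord';
    EntitySetName = 'deletedRecords';
    SourceTable = "Deleted Record Log";
    DelayedInsert = true;
    Editable = false;
    ODataKeyFields = "Entry No.";

    layout
    {
        area(Content)
        {
            repeater(Records)
            {
                field(entryNo; Rec."Entry No.") { }
                field(tableNo; Rec."Table No.") { }
                field(recordSystemId; Rec."Record System Id") { }
                field(deletedAt; Rec."Deleted At") { }
            }
        }
    }
}
```

The third party would then keep the highest entryNo it has seen and poll with an OData filter on the auto-incrementing key, for example: `GET .../deletedRecords?$filter=entryNo gt 12345`.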

cegekaJG commented 8 months ago

Thanks, I managed to make a custom business event using the global trigger. This will work for now.