Closed Arthurvdv closed 10 months ago
Indeed, that picture is very helpful. But because the lakehouse tables are Delta tables, Power BI uses the Direct Lake option, so no dataset is created.
A lakehouse is more about unstructured data, while the BC data is more structured. What you see is that the data is moved from the lakehouse to a warehouse, where everything is stored in a structured way. There you can define the FlowFields in the tables that you want and that work best for your reporting, like dimensions.
That last step is mostly customization, because every company wants to bring its own data and maybe combine a table from multiple tables (sources) in the lakehouse.
Thank you for elaborating on this, that's very helpful! My first thought was to place the Lakehouse in the center, but it makes sense (in my use case) to have a Warehouse in the center and move the data into it.
Are there plans to include a template in the project, like there is for Azure Synapse? This could help lower the learning curve for getting started with Microsoft Fabric. Or is this more suitable for an (online) course to be held in the future?
If you want to use FlowFields, then a warehouse is better. It is difficult to create a template, though, because a warehouse is mostly customization. But an example is indeed possible; a user could take that example and build on it. That could be an idea.
Could it be helpful to document in this project what the best approach is for handling FlowFields from Business Central to OneLake in Microsoft Fabric, combined with Direct Lake?
For example, when exporting the Item entity from Business Central, you may want to include Inventory as a column.
The first step will be to also expose the "Item Ledger Entry" entity with the Quantity column. From there, should this be a measure to include in every report over and over again? Or should a Dataflow be used to create a second Lakehouse for the calculation/transformation? Or is there another (better) way of achieving this?
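Whichever layer ends up doing the work (a measure, a Dataflow, or a warehouse view), the calculation itself is the same one the Inventory FlowField performs in Business Central: sum the Quantity of all Item Ledger Entries per item. A minimal sketch of that aggregation, with illustrative field names (`item_no`, `quantity`) that are assumptions about how the entity is exposed to OneLake:

```python
# Hedged sketch: materializing the "Inventory" FlowField outside Business
# Central by aggregating Item Ledger Entry quantities per item.
# The field names "item_no" and "quantity" are illustrative assumptions.
from collections import defaultdict

def inventory_per_item(item_ledger_entries):
    """Sum Quantity per Item No. -- the same calculation the
    Inventory FlowField performs on the fly in Business Central."""
    totals = defaultdict(int)
    for entry in item_ledger_entries:
        totals[entry["item_no"]] += entry["quantity"]
    return dict(totals)

# Example: two receipts and one shipment for item "1000"
entries = [
    {"item_no": "1000", "quantity": 10},
    {"item_no": "1000", "quantity": 5},
    {"item_no": "1000", "quantity": -3},
    {"item_no": "2000", "quantity": 7},
]
print(inventory_per_item(entries))  # {'1000': 12, '2000': 7}
```

Doing this once in a transformation step (rather than as a per-report measure) keeps the logic in one place, at the cost of the column only being as fresh as the last refresh.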