Bertverbeek4PS / bc2adls

Exporting data from Dynamics 365 Business Central to Azure data lake storage or MS Fabric lakehouse
MIT License

How to trigger a full reset with multi-company setup #146

Closed rem-bou closed 4 weeks ago

rem-bou commented 1 month ago

Hello,

I was trying to trigger the creation of a file in the "reset" folder that is used in the "CopyBusinessCentral" notebook: folder_path_reset = '/lakehouse/default/Files/reset/'

%%pyspark
import os

# "Lakehouse" is the target lakehouse name, set elsewhere in the notebook
folder_path_reset = '/lakehouse/default/Files/reset/'

if os.path.exists(folder_path_reset):
    for filename in os.listdir(folder_path_reset):
        # Strip the hyphen and the .txt extension to get the Delta table name
        table_name = filename.replace("-", "").replace(".txt", "")

        # Drop the table so the next run rebuilds it from scratch
        spark.sql(f"DROP TABLE IF EXISTS {Lakehouse}.{table_name}")

        try:
            # Delete the reset marker once the table has been dropped
            os.remove(os.path.join(folder_path_reset, filename))
        except OSError as e:  # catch any error when deleting the file
            print(f"Error: {filename} : {e.strerror}")

I wasn't able to trigger the creation of these files from bc2adls by clearing the logs, resetting tables, etc. How would you go about doing it?
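
In the meantime, based on the loop above, I assume a reset could also be forced by dropping a marker file into that folder by hand from a notebook cell (the file name below is only an example, use whatever name bc2adls created for your entity), but I would rather trigger it from the BC side:

%%pyspark
import os

# Hypothetical workaround, not an official bc2adls step: create an empty marker
# file so the loop above picks the table up on the next run. Only the file name
# matters; the loop never reads the file contents.
folder_path_reset = '/lakehouse/default/Files/reset/'
os.makedirs(folder_path_reset, exist_ok=True)

marker = os.path.join(folder_path_reset, 'Customer-18.txt')  # example entity name
with open(marker, 'w'):
    pass  # an empty file is enough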

As a side note, not all of our companies currently use the same BC features, so not every table is exported for every company, as some may not contain any data.

Bertverbeek4PS commented 4 weeks ago

In the latest version there is a "Delete option" setting (see the attached image). If that is enabled and you do a reset, a file will be exported.
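
To check from the Fabric side that the reset file actually landed before you run CopyBusinessCentral, you can list the folder from a notebook cell (assuming the default lakehouse is mounted at /lakehouse/default, as in the path above):

%%pyspark
import os

folder_path_reset = '/lakehouse/default/Files/reset/'

# Show any reset markers exported by bc2adls; CopyBusinessCentral will drop the
# matching Delta tables and delete these files on its next run.
if os.path.exists(folder_path_reset):
    print(os.listdir(folder_path_reset))
else:
    print('No reset folder yet, nothing has been exported.')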

rem-bou commented 4 weeks ago

Thank you! I couldn't find the trigger....