Closed ssahoo-phdata closed 1 year ago
Update: I will be closing this bug, as I found the cause of the satellite not filtering out the duplicate records arriving in subsequent runs. The duplicate records coming into the source tables have distinct LOAD_DATETIME values, so the satellite treats the duplicates as distinct records and inserts them into the satellite.
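To make the failure mode concrete, here is a small Python sketch (this is not automate_dv code; the column names and hashing scheme are illustrative) of why rows that differ only in LOAD_DATETIME survive a hashdiff-style comparison:

```python
# Illustrative sketch (not automate_dv code): two source rows carry an
# identical business payload but distinct LOAD_DATETIME values.
import hashlib

def hashdiff(payload: dict) -> str:
    """Hash only the descriptive payload columns, like a satellite HASHDIFF."""
    concat = "||".join(str(payload[k]) for k in sorted(payload))
    return hashlib.md5(concat.encode()).hexdigest()

rows = [
    {"customer_hk": "A1", "payload": {"name": "Ann", "city": "Oslo"},
     "load_datetime": "2023-06-01 10:00:00"},
    {"customer_hk": "A1", "payload": {"name": "Ann", "city": "Oslo"},
     "load_datetime": "2023-06-01 10:05:00"},  # same payload, later load time
]

# Deduplicating on (key, hashdiff) collapses the true duplicates...
by_hashdiff = {(r["customer_hk"], hashdiff(r["payload"])) for r in rows}
print(len(by_hashdiff))  # 1

# ...but if LOAD_DATETIME participates in distinctness, both rows survive,
# which is why duplicates end up inserted into the satellite.
by_hashdiff_and_ldts = {
    (r["customer_hk"], hashdiff(r["payload"]), r["load_datetime"]) for r in rows
}
print(len(by_hashdiff_and_ldts))  # 2
```

The fix on the loading side is to ensure true duplicates share a single LOAD_DATETIME per batch, or to deduplicate upstream of the staging layer.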
Describe the bug
Bug produced in dbt Cloud where the JOIN condition in the records_to_insert clause for a subsequent run of a SAT table does not match the code shown in the documentation. Duplicate records coming from the raw tables in incremental runs are not filtered out in the satellite.
Expected code in subsequent loads in SAT table macro
https://automate-dv.readthedocs.io/en/v0.9.4/macros/#example-output_3
Produced Code
Environment
- dbt version: 1.4.6
- automate_dv version: 0.9.5
- Database/Platform: Snowflake
To Reproduce
Steps to reproduce the behavior:
Expected behavior
Expected code in subsequent loads of the SAT table macro:
https://automate-dv.readthedocs.io/en/v0.9.4/macros/#example-output_3