fivetran / dbt_linkedin

Fivetran's Linkedin Ads dbt package
https://fivetran.github.io/dbt_linkedin/
Apache License 2.0

[Bug] constant expressions are not supported in partition by clauses in Redshift due to linkedin_ads_source.stg_linkedin_ads__creative_history #33

Closed yuna-tang closed 8 months ago

yuna-tang commented 11 months ago

Is there an existing issue for this?

Describe the issue

We are planning to upgrade to dbt 1.6 with Fivetran/ad_reporting package [">=1.7.0", "<1.8.0"].

However, we have encountered the Redshift error below for the model linkedin_ads_source.stg_linkedin_ads__creative_history. This is with v0.8.0.

The issue is that source_relation is an empty-string constant in the 'fields' CTE.

When I remove source_relation from the partition, the CTE below works. Otherwise, it returns the "constant expressions are not supported in partition by clauses" error.

cast('' as TEXT) as source_relation

row_number() over (partition by source_relation, id order by coalesce(last_modified_at, last_modified_time) desc) = 1 as is_latest_version

fields as (

    select
        source_relation,
        id as creative_id,
        campaign_id,
        coalesce(intended_status, status) as status,
        click_uri,
        cast(coalesce(last_modified_at, last_modified_time) as {{ dbt.type_timestamp() }}) as last_modified_at,
        cast(coalesce(created_at, created_time) as {{ dbt.type_timestamp() }}) as created_at,
        row_number() over (partition by source_relation, id order by coalesce(last_modified_at, last_modified_time) desc) = 1 as is_latest_version

    from macro
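
For reference, below is a minimal, hypothetical sketch of the pattern involved (the table, view, and values are made up, not the package's compiled SQL). The form Redshift reportedly rejects is shown in comments; the live query is the workaround with source_relation dropped from the partition:

    -- Hypothetical setup: a view that adds source_relation as a cast of an empty string,
    -- mirroring what the *_tmp staging relation does.
    create table if not exists creative_history_raw (id int, last_modified_at timestamp);
    insert into creative_history_raw values (1, '2023-11-01'), (1, '2023-11-05');

    create or replace view creative_history_vw as
    select id, last_modified_at, cast('' as varchar) as source_relation
    from creative_history_raw;

    -- Reportedly fails on Redshift while creative_history_vw is a view, because the
    -- literal gets inlined into the window function:
    --   select row_number() over (partition by source_relation, id
    --                             order by last_modified_at desc) = 1 as is_latest_version
    --   from creative_history_vw;

    -- The workaround applied here (dropping the constant from the partition) runs:
    select id,
           row_number() over (partition by id order by last_modified_at desc) = 1 as is_latest_version
    from creative_history_vw;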

What would be a workaround that lets us upgrade to the Fivetran/ad_reporting package [">=1.7.0", "<1.8.0"]?
Or could the partition issue be addressed in a future release of the linkedin_ads_source dbt package?

Relevant error log or model output

00:17:19    Database Error in model stg_linkedin_ads__creative_history (models/stg_linkedin_ads__creative_history.sql)
  constant expressions are not supported in partition by clauses
  compiled Code at target/run/linkedin_source/models/stg_linkedin_ads__creative_history.sql

Expected behavior

There shouldn't be any database error in the model stg_linkedin_ads__creative_history; in the previous version the partition was not based on an empty string.

The fix that I applied was to remove source_relation from the partition clause in stg_linkedin_ads__creative_history.sql.

This is per v0.7.0 of [stg_linkedin_ads__creative_history.sql](https://github.com/fivetran/dbt_linkedin_source/blob/v0.7.0/models/stg_linkedin_ads__creative_history.sql):

row_number() over (partition by id order by coalesce(last_modified_at, last_modified_time) desc) = 1 as is_latest_version

with base as (

    select *
    from "datamapping"."prod_linkedin_ads_source"."stg_linkedin_ads__creative_history_tmp"

), macro as (

    select
        campaign_id as campaign_id,
        click_uri as click_uri,
        created_time as created_time,
        created_at as created_at,
        id as id,
        last_modified_time as last_modified_time,
        last_modified_at as last_modified_at,
        intended_status as intended_status,
        status as status,
        cast('' as TEXT) as source_relation
    from base

), fields as (

    select
        source_relation,
        id as creative_id,
        campaign_id,
        coalesce(intended_status, status) as status,
        click_uri,
        cast(coalesce(last_modified_at, last_modified_time) as TIMESTAMP) as last_modified_at,
        cast(coalesce(created_at, created_time) as TIMESTAMP) as created_at,

        -- below is the original code, partitioning by source_relation:
        -- row_number() over (partition by source_relation, id order by coalesce(last_modified_at, last_modified_time) desc) = 1 as is_latest_version
        row_number() over (partition by id order by coalesce(last_modified_at, last_modified_time) desc) = 1 as is_latest_version

    from macro
)

select * from fields;

dbt Project configurations

vars:
  surrogate_key_treat_nulls_as_empty_strings: true #turn on legacy behavior
  'dbt_date:time_zone': 'UTC'

  salesforce_schema: fivetran_salesforce
  salesforce__order_enabled: False
  apple_search_ads_database: datamapping # manually added for serverless datashare
  ad_reporting__microsoft_ads_enabled: True
  microsoft_ads_database: datamapping # manually added for serverless datashare
  ad_reporting__linkedin_ads_enabled: True
  linkedin_ads_database: datamapping # manually added for serverless datashare
  ad_reporting__google_ads_enabled: True
  google_ads_database: datamapping # manually added for serverless datashare
  #The Adwords API has since been deprecated by Google and is now no longer the standard API for the Google Ads connector.
  google_ads_schema: adwords_new_api
  ad_reporting__twitter_ads_enabled: True
  twitter_ads_database: datamapping # manually added for serverless datashare
  ad_reporting__facebook_ads_enabled: True
  facebook_ads_database: datamapping # manually added for serverless datashare
  ad_reporting__snapchat_ads_enabled: False
  ad_reporting__tiktok_ads_enabled: False
  ad_reporting__pinterest_ads_enabled: False
  ad_reporting__reddit_ads_enabled: False
  ad_reporting__amazon_ads_enabled: False
  #The api_source variable is now defaulted to google_ads as opposed to adwords.
  api_source: google_ads

Package versions

packages:

What database are you using dbt with?

redshift

dbt Version

dbt version: 1.6

Additional Context

No response

Are you willing to open a PR to help address this issue?

fivetran-catfritz commented 11 months ago

Hi @yuna-tang thank you for submitting this issue! I will be looking into this in the coming week and will let you know if I have any questions.

fivetran-catfritz commented 10 months ago

Hi @yuna-tang I am trying to reproduce this error in Redshift; however, I am unable to, possibly because we have limited data in our Redshift instance. By the way, have you changed the materializations of your models to be views? I believe I have seen this before when all the outputs are views.

Either way I think we might need to update our partition by clause to remove source_relation if it's not needed, but I wanted to get a little more information first.

yuna-tang commented 10 months ago

Hi @fivetran-catfritz, we didn't change the materializations of our models. This model and upstream model are all generated via the fivetran/ad_reporting package. Below is the SQL statement to create the upstream table:


fivetran-catfritz commented 10 months ago

@yuna-tang Thank you much for the extra information. I am taking this back to my team and will let you know what the plan is.

yuna-tang commented 10 months ago

@fivetran-catfritz Thank you for the updates. I'm looking forward to hearing the next step. My team is waiting for this package to be fixed before we can update to dbt 1.6 later this year.

fivetran-catfritz commented 10 months ago

Hi @yuna-tang would you be open to walking us through the issue you're seeing live so we can see your environment? We are still having a hard time recreating the issue but want to make sure our fix is appropriate. If so, please email solutions@fivetran.com with your availability.

yuna-tang commented 10 months ago

@fivetran-catfritz I am based in Sydney, Australia. Will any hours be Sydney-time friendly? I will email my availability to solutions@fivetran.com.

fivetran-catfritz commented 10 months ago

Hi @yuna-tang I am US Chicago time, so we have some overlap! Probably anytime before 10am Sydney time should work for us. I'm not available on Wednesday/your Thursday, but my schedule is open otherwise. If you send us your times, we can work it out from there.

fivetran-catfritz commented 10 months ago

@yuna-tang Also, in the meantime before we meet, could you try using the setting below in your dbt_project.yml and see if that also resolves the issue? I don't think this is necessarily the final solution, but it might give us another clue as to what's going on with Redshift.

models:
  linkedin_source:
    tmp:
      +materialized: table
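
A hedged aside on why this setting can help: once the *_tmp model is materialized as a physical table, source_relation is read back as a stored varchar column rather than an inlined literal, so Redshift's planner no longer sees a constant expression in the PARTITION BY. A rough sketch with made-up names:

    -- Rough sketch: with the data in a physical (temp) table, the same window
    -- function that fails over a view is accepted.
    create temp table creative_history_tmp_tbl as
    select 1 as id,
           cast('' as varchar) as source_relation,
           timestamp '2023-11-01 00:00:00' as last_modified_at;

    select id,
           row_number() over (partition by source_relation, id
                              order by last_modified_at desc) = 1 as is_latest_version
    from creative_history_tmp_tbl;
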
yuna-tang commented 10 months ago

@fivetran-catfritz Once I materialize it as a table, there is no error anymore. However, we would prefer to keep those sources as views.

  # materialize linkedin source to table to avoid issue constant expressions are not supported in partition by clauses
  # see github [issue](https://github.com/fivetran/dbt_linkedin/issues/33#issuecomment-1828973770)
  linkedin_source:
    tmp:
      +materialized: table
fivetran-catfritz commented 10 months ago

Hi @yuna-tang thanks for meeting with me today and showing me the issue you're seeing! To summarize what we discussed, I agree we will need to make an update to our package, however it's hard to predict exactly when we will release the change. In the meantime, I will send you a test branch you can install that should fix your issue.

yuna-tang commented 10 months ago

Thank you @fivetran-catfritz for your time today. I am looking forward to the test branch fix and future package update.

fivetran-catfritz commented 10 months ago

Hi @yuna-tang I was able to get to the test branch today. This is an example of how I updated the source package. You can install it using the code snippet below in place of your current ad_reporting in your packages.yml. This might not be our final solution, but it should work in the meantime!

packages:
  - git: https://github.com/fivetran/dbt_ad_reporting.git
    revision: test/update-partition-by
    warn-unpinned: false

Let me know how it works!

yuna-tang commented 10 months ago

@fivetran-catfritz Thank you so much for updating the test branch so quickly. :) I just ran the command below and it works locally. I will test it against our pipeline this week to see if it creates any issues in downstream models.

# Below is the temporary fix for the linkedin source to avoid the error "constant expressions are not supported in partition by clauses"
# see github [issue](https://github.com/fivetran/dbt_linkedin/issues/33#issuecomment-1828973770)
  - git: https://github.com/fivetran/dbt_ad_reporting.git
    revision: test/update-partition-by
    warn-unpinned: false

# materialize linkedin source to table to avoid the error "constant expressions are not supported in partition by clauses"
# see github [issue](https://github.com/fivetran/dbt_linkedin/issues/33#issuecomment-1828973770)
  linkedin_source:
    tmp:
      +materialized: table

dbt build --select +stg_linkedin_ads__creative_history


yuna-tang commented 10 months ago

@fivetran-catfritz I have tested it against our pipeline, and there are no issues with the test branch. Please let us know when the fix will be incorporated into the main ad_reporting package.

fivetran-catfritz commented 9 months ago

@yuna-tang That's great! Thank you for testing that out. Our team is planning to pick this back up in the early new year since they want to look at the issue from a few more perspectives. In the meantime, I recommend continuing to use the test branch. We won't make any other changes to it, so it will be stable to keep using. The next update for this package should include it, so you also won't miss other updates to this package if you pin to the branch. I'll post back here when this gets picked up!

yuna-tang commented 9 months ago

@fivetran-catfritz Thank you for the confirmation. I will follow up next year to see the updates. For now, I will continue using the test branch. :)

fivetran-catfritz commented 9 months ago

Hi @yuna-tang Happy New Year! I have revisited this issue and believe I found a more robust solution. This time I updated the way our fivetran_utils macros handle the source_relation column instead of modifying the linkedin package directly. It is proving out on my end. Would you be able to test it out on your data? For now I was only able to apply this update to the linkedin package. Let me know, thanks!

packages:
  - git: https://github.com/fivetran/dbt_linkedin.git
    revision: test/redshift-constant-exp
    warn-unpinned: false
yuna-tang commented 9 months ago

@fivetran-catfritz Happy New Year and hope you had a nice break! Just to double-confirm: in packages.yml, should I update it as below, since this only changes the LinkedIn package?

packages:
  - package: fivetran/ad_reporting
    version: [">=1.7.0", "<1.8.0"]
#   Below is the temporary fix for linkedin source to avoid error "constant expressions are not supported in partition by clauses"
#   see github [issue](https://github.com/fivetran/dbt_linkedin/issues/33#issuecomment-1828973770)
  - git: https://github.com/fivetran/dbt_linkedin.git
    revision: test/redshift-constant-exp
    warn-unpinned: false

I have encountered the issue below with the ad_reporting package. How do I pin just the LinkedIn package to this new revision? We also need the ad_reporting package.

fivetran-catfritz commented 9 months ago

Hi @yuna-tang ahh thanks for pointing that out, it makes sense you're getting that error. Right now you won't be able to pin just the linkedin package. What you would need to do, just for testing purposes, is:

  1. Comment out the linkedin package and run dbt deps with the ad reporting package.
  2. Comment out the ad reporting package and re-add the linkedin github lines. Then run dbt deps again.
  3. Now it should run as normal

Let me know if that works!

yuna-tang commented 9 months ago

@fivetran-catfritz Let me try that in my pipeline, but it won't work for the deploy jobs: each time a dbt job runs, it runs dbt deps against the packages.yml file. If I comment out ad_reporting, it also means we won't get the latest updates to the ad_reporting package if there are any.

For the permanent fix, I assume the ad_reporting package will pick up the LinkedIn package fix if it works?

For the method below, I have encountered a new issue. Below are the package details that I followed after step 2 of the instructions. Do you think the conflict is caused by the salesforce_formula_utils package?

  1. Comment out the linkedin package and run dbt deps with the ad reporting package.
  2. Comment out the ad reporting package and re-add the linkedin github lines. Then run dbt deps again.
  3. Now it should run as normal
packages:
  - package: dbt-labs/dbt_utils
    version: [">=0.9.0", "<2.0.0"]
  - package: dbt-labs/audit_helper
    version: [">=0.9.0", "<1.0.0"]
  - package: dbt-labs/codegen
    version: [">=0.9.0", "<1.0.0"]
  - package: dbt-labs/redshift
    version: [">=0.9.0", "<1.0.0"]
  - package: calogica/dbt_expectations
    version: [">=0.10.0", "<0.11.0"]
#  - package: fivetran/ad_reporting
#    version: [">=1.7.0", "<1.8.0"]
#   Below is the temporary fix for linkedin source to avoid error "constant expressions are not supported in partition by clauses"
#   see github [issue](https://github.com/fivetran/dbt_linkedin/issues/33#issuecomment-1828973770), only change the Linkedin package.
  - git: https://github.com/fivetran/dbt_linkedin.git
    revision: test/redshift-constant-exp
    warn-unpinned: false
  - package: fivetran/salesforce_source
    version: [">=1.0.0", "<1.1.0"]
  - package: fivetran/salesforce_formula_utils
    version: [">=0.9.0", "<0.10.0"]

Also, we are planning to upgrade to dbt 1.6 around 12 Jan. Can we still rely on this pinned version of ad_reporting for now, until we find a solution for the LinkedIn package?

  - git: https://github.com/fivetran/dbt_ad_reporting.git
    revision: test/update-partition-by
    warn-unpinned: false
fivetran-catfritz commented 9 months ago

Hi @yuna-tang thanks for trying that out! For the ad_reporting test branch, yes, you can continue using the test branch for ad_reporting. It will stay up until the official fix is released, and even after that we'll leave it up for a bit.

As for the current fix I'm asking you to test, yes this 2-step dbt deps process is only for testing. The final release is intended to apply to all our packages and would be installed normally. The reason we have to test this way is because I made changes to our base utility package, and it behaves differently than the last fix I made.

As for getting it to run in your local pipeline, I didn't think about your other packages. Before you run dbt deps the second time, you will need to comment out all the packages that are not the linkedin git branch because, as you figured out, there will be a conflict with our other packages. Alternatively, since you're running this locally, you could install only the linkedin git package just to test it out. I mostly want to know if it runs correctly in your warehouse before expanding it to all our packages.

yuna-tang commented 9 months ago

@fivetran-catfritz Thank you for keeping the test branch for ad_reporting. Please keep us updated when the official fix is released in the main ad_reporting package so we can make the package switch.

I have followed the steps below in my local dev environment; however, I have re-encountered the error :( What's the next step?

02:39:56    Database Error in model stg_linkedin_ads__creative_history (models/stg_linkedin_ads__creative_history.sql)
  constant expressions are not supported in partition by clauses
  compiled Code at target/run/linkedin_source/models/stg_linkedin_ads__creative_history.sql

Below are the steps followed:

1. Run dbt deps after commenting out the linkedin package below:
#  - git: https://github.com/fivetran/dbt_linkedin.git
#    revision: test/redshift-constant-exp
#    warn-unpinned: false
2. Run dbt deps after commenting out all the other packages except the linkedin one:
  - git: https://github.com/fivetran/dbt_linkedin.git
    revision: test/redshift-constant-exp
    warn-unpinned: false
3. Run dbt build --select +stg_linkedin_ads__creative_history

Screenshots for steps 1, 2, and 3 were attached.

fivetran-catfritz commented 9 months ago

Hi @yuna-tang thank you for testing that out again and providing the screenshots. I discussed this with my team, and looking at this from another angle, we noticed you are also using facebook_ads and google_ads, which have similar partitioning logic. Did you experience any issue with the other ad packages? For example, if you run facebook_ads only in your local environment, do you get the same error?

yuna-tang commented 9 months ago

Hi @fivetran-catfritz, I have run the following tests against the facebook_ads and google_ads packages in my local dev environment, and neither has an issue with the partition by clauses. I think we only experience the issue with the linkedin_ads package. Can you please advise on the next step and how we can help troubleshoot?

  1. Test against facebook_ads
    • model created:
      -- "datamapping"."dev_yuna_facebook_ads_source"."stg_facebook_ads__creative_history"
      final as (
      row_number() over (partition by source_relation, id order by _fivetran_synced desc) = 1 as is_most_recent_record
      from fields)

  2. Test against google_ads
    • model created:
      -- "datamapping"."dev_yuna_google_ads_source"."stg_google_ads__ad_group_criterion_history"
      final as (
      row_number() over (partition by source_relation, id order by updated_at desc) = 1 as is_most_recent_record
      from fields)

yuna-tang commented 9 months ago

Hi @fivetran-joemarkiewicz, @fivetran-catfritz, we didn't modify the google_ads & facebook_ads packages. For linkedin_ads, we saw a blank value coming through under source_relation; see the attached screenshot for an example.

Hi @yuna-tang thanks for the response and provided details! Out of curiosity, I see from the results you shared that your source_relation field is showing as a 0 instead of an empty string. Did you make any modifications to have the results of the source relation field be an integer? Is a 0 also what you see when you query a LinkedIn Ads *_tmp model?

fivetran-joemarkiewicz commented 9 months ago

Apologies @yuna-tang it seems @fivetran-catfritz added that integer to see if that would avoid the constant expression issue, but I was unaware of that change upon my previous response.

It is quite perplexing that the same logic (which does contain constant expressions) works just fine for the other ad platforms but errors out with LinkedIn. I wonder if something for LinkedIn got mixed up at some point with the schema the dbt models are writing to, and that resulted in some confusion during materialization. Currently, I am hesitant for us to move forward with a fix while we are still unsure what is causing the issue. Additionally, the original fix @fivetran-catfritz proposed (while effective) will require a significant amount of effort to update. I would like to ensure we are 100% sure this is the only option before proceeding.

Would you be able to attempt running the models in a completely new schema? This way we can check whether the same error persists on a fresh schema. If it does not, the issue may be due to the schema having had some problem introduced at one point. You can write the models to a new/fresh schema by simply adding the following config to your dbt_project.yml.

models:
   +schema: testing_schema

This will result in all of the models from the package being materialized in a schema named dev_yuna_testing_schema. This way you will be able to simply delete the entire schema once you are done with this test.

yuna-tang commented 9 months ago

@fivetran-joemarkiewicz Thank you for the update. For the new testing schema you are suggesting, shall I keep the original package configuration, or still point to the package below that @fivetran-catfritz proposed?

-- @fivetran-catfritz proposed package

2. Run dbt deps after commenting out all the other packages except the linkedin one:
  - git: https://github.com/fivetran/dbt_linkedin.git
    revision: test/redshift-constant-exp
    warn-unpinned: false

Below is the dbt package config. I will comment out the temporary fix.


#   Below is the temporary fix for linkedin source to avoid error "constant expressions are not supported in partition by clauses"
#   see github [issue](https://github.com/fivetran/dbt_linkedin/issues/33#issuecomment-1828973770),fivetran proposed new linkedin package update only
#  - git: https://github.com/fivetran/dbt_ad_reporting.git
#    revision: test/update-partition-by
#    warn-unpinned: false
-- our original package configuration for the dbt 1.6 upgrade

packages:
  - package: dbt-labs/dbt_utils
    version: [">=0.9.0", "<2.0.0"]
  - package: dbt-labs/audit_helper
    version: [">=0.9.0", "<1.0.0"]
  - package: dbt-labs/codegen
    version: [">=0.9.0", "<1.0.0"]
  - package: dbt-labs/redshift
    version: [">=0.9.0", "<1.0.0"]
  - package: calogica/dbt_expectations
    version: [">=0.10.0", "<0.11.0"]
  - package: fivetran/ad_reporting
    version: [">=1.7.0", "<1.8.0"]
#   Below is the temporary fix for linkedin source to avoid error "constant expressions are not supported in partition by clauses"
#   see github [issue](https://github.com/fivetran/dbt_linkedin/issues/33#issuecomment-1828973770),fivetran proposed new linkedin package update only
#  - git: https://github.com/fivetran/dbt_ad_reporting.git
#    revision: test/update-partition-by
#    warn-unpinned: false
  - package: fivetran/salesforce_source
    version: [">=1.0.0", "<1.1.0"]
  - package: fivetran/salesforce_formula_utils
    version: [">=0.9.0", "<0.10.0"]
fivetran-joemarkiewicz commented 9 months ago

Yeah, I would recommend using the official version of the ad_reporting package as opposed to @fivetran-catfritz's branch for this test. I would like to see if moving to a fresh schema works as expected with the live version of the package before exploring further changes in the branch. Thanks!

yuna-tang commented 9 months ago

@fivetran-joemarkiewicz I started with a brand new test schema, dev_yuna_test_linkedin, and the same error still exists in the LinkedIn models.

06:32:26  Completed with 1 error and 0 warnings:
06:32:26  
06:32:26    Database Error in model stg_linkedin_ads__creative_history (models/stg_linkedin_ads__creative_history.sql)
  constant expressions are not supported in partition by clauses
  compiled Code at target/run/linkedin_source/models/stg_linkedin_ads__creative_history.sql

Below are the steps that I followed:

packages:
  - package: dbt-labs/dbt_utils
    version: [">=0.9.0", "<2.0.0"]
  - package: dbt-labs/audit_helper
    version: [">=0.9.0", "<1.0.0"]
  - package: dbt-labs/codegen
    version: [">=0.9.0", "<1.0.0"]
  - package: dbt-labs/redshift
    version: [">=0.9.0", "<1.0.0"]
  - package: calogica/dbt_expectations
    version: [">=0.10.0", "<0.11.0"]
  - package: fivetran/ad_reporting
    version: [">=1.7.0", "<1.8.0"]
#   Below is the temporary fix for linkedin source to avoid error "constant expressions are not supported in partition by clauses"
#   see github [issue](https://github.com/fivetran/dbt_linkedin/issues/33#issuecomment-1828973770), only change the Linkedin package.
#  - git: https://github.com/fivetran/dbt_linkedin.git
#    revision: test/redshift-constant-exp
#    warn-unpinned: false
  - package: fivetran/salesforce_source
    version: [">=1.0.0", "<1.1.0"]
  - package: fivetran/salesforce_formula_utils
    version: [">=0.9.0", "<0.10.0"]

We don't have any other configuration related to ad_reporting in dbt_project.yml apart from the following:

vars:
  ad_reporting__microsoft_ads_enabled: True
  microsoft_ads_database: datamapping # manually added for serverless datashare
  ad_reporting__linkedin_ads_enabled: True
  linkedin_ads_database: datamapping # manually added for serverless datashare
  ad_reporting__google_ads_enabled: True
  google_ads_database: datamapping # manually added for serverless datashare
  #The Adwords API has since been deprecated by Google and is now no longer the standard API for the Google Ads connector.
  google_ads_schema: adwords_new_api
  ad_reporting__twitter_ads_enabled: True
  twitter_ads_database: datamapping # manually added for serverless datashare
  ad_reporting__facebook_ads_enabled: True
  facebook_ads_database: datamapping # manually added for serverless datashare
  ad_reporting__snapchat_ads_enabled: False
  ad_reporting__tiktok_ads_enabled: False
  ad_reporting__pinterest_ads_enabled: False
  ad_reporting__reddit_ads_enabled: False
  ad_reporting__amazon_ads_enabled: False
yuna-tang commented 8 months ago

Hey @fivetran-joemarkiewicz @fivetran-catfritz Just following up on my previous post: what is the recommended next step for us to fix this issue with Redshift?

fivetran-catfritz commented 8 months ago

Hi @yuna-tang would you be available for another call with me and @fivetran-joemarkiewicz to go through your setup more in depth? If so, you can schedule a time on my calendar via this link. Thanks!

yuna-tang commented 8 months ago

@fivetran-catfritz @fivetran-joemarkiewicz Thank you for the messages. I have scheduled a meeting to go through my setup.

fivetran-catfritz commented 8 months ago

Hi @yuna-tang thank you for meeting with us again and helping us find the root cause of the issue. I have updated the original test branch, so you should be able to go back to using this branch in the meantime.

  - git: https://github.com/fivetran/dbt_ad_reporting.git
    revision: test/update-partition-by
    warn-unpinned: false

We can now move forward with a release and will keep you posted, but keep using this branch until then.

Notes from the call:

yuna-tang commented 8 months ago

Hi @fivetran-catfritz, I can confirm the new fix in the original test branch: stg_linkedin_ads__creative_history is working. See the attached screenshots for your information.

For example, id = '46532866' in stg_linkedin_ads__creative_history picks up '1970-01-01 00:00:00.000000' as last_modified_at for the historical records synced before 2023-04-13.

fivetran-catfritz commented 8 months ago

Hi @yuna-tang thanks for the screenshots, they clearly illustrate what is going on! I have picked up this issue and aim to release the update by the end of next week. Thanks again!

yuna-tang commented 8 months ago

Hi @fivetran-catfritz, thank you for the confirmation. I also want to check whether Fivetran uses epoch time; when the value is 0 or empty, it will return 1970-01-01. Do you think this is part of the reason the date 1970-01-01 is filled in for the new last_modified_at column?
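
For reference, the epoch-zero behavior being asked about can be shown with a one-line query (Redshift syntax, purely illustrative): converting epoch second 0 yields 1970-01-01 00:00:00 UTC, which is why a 0/empty source value surfaces as that date.

    -- Epoch second 0 corresponds to 1970-01-01 00:00:00 UTC.
    select timestamp 'epoch' + 0 * interval '1 second' as epoch_zero;
    -- epoch_zero
    -- -------------------
    -- 1970-01-01 00:00:00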

fivetran-catfritz commented 8 months ago

Hi @yuna-tang, I talked to our engineering team about this, and they mentioned that they deliberately set the back-fill values to 1970-01-01, but I'm not really sure of the details beyond that.

But good news! I have released the patch today, so it should automatically get picked up the next time you run dbt deps with the normal ad_reporting package. Thanks again for all your help troubleshooting. Please also let me know if everything runs smoothly for you now.

yuna-tang commented 8 months ago

Hi @fivetran-catfritz . Thank you for the update :) I will update the package to point to the normal ad_reporting package and run the test against our pipeline.

yuna-tang commented 8 months ago

Hi @fivetran-catfritz I think we can close this bug now. The patch has run successfully in our production pipeline. Thank you for all the help!

fivetran-catfritz commented 8 months ago

Hi @yuna-tang hooray that's great, and thank you for confirming!! Closing out the issue. 😃