Closed smitsrr closed 11 months ago
Hey @smitsrr ! thanks for opening the issue / discussion
There was a similar ask / discussion in https://github.com/get-select/dbt-snowflake-monitoring/issues/130#issuecomment-1743965491
IMO this isn't worth the added complexity / work. It may take a few hours on a xsmall warehouse the first time it runs (<$10), then after that full refresh it will be very quick (& super cheap) going forwards.
cc @NiallRees lemme know your thoughts
+1 here. It should be pretty simple with a project-level var
and something like this in the where clauses:
where query_start_time >
{% if is_incremental() %}
(select coalesce(max(query_start_time), date '1970-01-01') from {{ this }})
{% else %}
    '{{ var("dbt_snowflake_monitoring_start_date") }}'
{% endif %}
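For reference, setting such a variable in `dbt_project.yml` might look like the sketch below. The variable name matches the snippet above; note this is a hypothetical configuration, since the package does not yet read this variable.

```yaml
# dbt_project.yml — hypothetical sketch, assuming the package adopts
# the dbt_snowflake_monitoring_start_date var proposed above.
vars:
  # Earliest query_start_time to backfill on a full refresh.
  dbt_snowflake_monitoring_start_date: '2023-01-01'
```

With dbt's standard var precedence, a consumer project could override this on the command line via `--vars` as well.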
Another note worth adding here: Snowflake only retains a year of query history anyway: https://docs.snowflake.com/en/sql-reference/account-usage/query_history
With that in mind, I retract my comment.
I installed the package yesterday and was quite surprised by how long the 'stg' queries took to execute (>5 minutes before I cancelled). Honestly, though, I don't need all-time history for analysis; I'd be happy with the last 6-9 months.
Could you include an option, in the form of a package variable, for a 'full-refresh' start date that can be overridden in dbt_project.yml?