Describe the bug - Required
The dbt `safe_cast` macro fails on SQL Warehouse Pro in Databricks (context in Slack). In the service category staging code:

```
{% if target.type == 'duckdb' %}
where try_cast(hcpcs_code as integer) is not null
{% else %}
where {{ safe_cast('hcpcs_code', 'int') }} is not null
{% endif %}
```

The dbt cross-database macro appears not to be mapped to the `try_cast` function (available in both Spark SQL and Databricks SQL). I attempted the revision below, which still compiles to `cast()`:

```
{{ dbt.safe_cast("hcpcs_code", api.Column.translate_type("integer")) }}
```
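One possible workaround (a sketch, not yet tested against Databricks; the file path and project name below are hypothetical) is to supply an adapter-prefixed override of `safe_cast` in the project's macros directory, which dbt's `adapter.dispatch` mechanism can resolve ahead of the built-in `default__safe_cast`:

```
-- macros/safe_cast.sql (hypothetical path; sketch only)
-- dbt resolves dbt.safe_cast via adapter.dispatch, so an adapter-prefixed
-- macro named databricks__safe_cast should shadow default__safe_cast
-- once dispatch is configured to search this project first.
{% macro databricks__safe_cast(field, type) %}
    try_cast({{ field }} as {{ type }})
{% endmacro %}
```

For dispatch to find this override, `dbt_project.yml` would also need a `dispatch` entry for the `dbt` macro namespace with the project's own name ahead of `dbt` in `search_order`.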
Environment - Required
Tuva project package version (e.g. 0.6.0): 0.12.5
dbt version (e.g. 1.7): 1.8
dbt type (e.g. dbt cloud or dbt CLI): dbt CLI
Data warehouse (e.g. Snowflake): Databricks
To Reproduce
Steps to reproduce the behavior:
Clone latest Tuva project
Run locally against SQL Warehouse Pro, in Databricks
Expected behavior
The code works: `safe_cast` compiles to `try_cast()` on Databricks.
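Concretely, the expectation is that the filter compiles on Databricks to something like the following (illustrative rendering, not actual compiled output):

```
-- expected rendering of the safe_cast filter on Databricks
where try_cast(hcpcs_code as integer) is not null
-- today it instead renders as a plain cast, which can raise an error on
-- non-numeric values under Databricks SQL's default ANSI mode:
-- where cast(hcpcs_code as int) is not null
```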
Additional context
Happy to address this ticket; I just need to test a revision with Databricks as the `target.type` in `profiles.yml`.