Closed: irsath closed this issue 10 months ago.
This issue has been marked as Stale because it has been open for 180 days with no activity. If you would like the issue to remain open, please comment on the issue or else it will be closed in 7 days.
Although we are closing this issue as stale, it's not gone forever. Issues can be reopened if there is renewed community interest. Just add a comment to notify the maintainers.
Is this a new bug in dbt-spark?
Current Behavior
The `file_format` config is not taken into account: table materialization is hard-coded to use `delta`. See https://github.com/dbt-labs/dbt-spark/blob/e741034160444eb7aa06aef7550a366cdcacc913/dbt/include/spark/macros/materializations/table.sql#L101

My use case is to use AWS Glue as my global metastore. My Databricks clusters are linked to my Glue metastore, and I can successfully read and write tables in it. Unfortunately, I'm unable to write anything other than Delta tables, since the format is hardcoded.
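To illustrate the expected fix (this is a hypothetical sketch, not the actual dbt-spark macro source): the materialization should template the `USING` clause from the model's `file_format` config instead of emitting a literal `delta`, roughly along these lines:

```sql
{#- Hypothetical Jinja/SQL sketch, not the real macro: the file_format
    model config drives the USING clause instead of a hardcoded 'delta' -#}
{%- set file_format = config.get('file_format', default='delta') -%}

create or replace table {{ relation }}
  using {{ file_format }}
  as
  {{ sql }}
```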
Expected Behavior
The table is materialized in the format specified in the `file_format` config.

Steps To Reproduce
Execute this model:
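(The original model snippet was not captured in this report. The following is a hypothetical minimal model that should reproduce the problem, assuming a Glue-backed metastore and a non-delta format such as `parquet`; the model name is illustrative.)

```sql
-- models/my_parquet_table.sql (hypothetical reproduction model)
{{
  config(
    materialized='table',
    file_format='parquet'  -- expected format; the table is still created as delta
  )
}}

select 1 as id
```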
Relevant log output
No response
Environment
Additional Context
No response