Closed · leo-schick closed this 1 year ago
@leo-schick thanks for opening! This is something we should add. (As an aside for dbt Labs maintainers: would we consider parity issues with SQL as bugs or enhancements?)
We may want to transfer this over to https://github.com/dbt-labs/dbt-spark, and then create a duplicate issue in https://github.com/databricks/databricks.
Edit: I'm going to go ahead and transfer this to the Spark repo.
I'd say it is an enhancement, since it was never defined as a feature that Apache Spark supports `location_root` for Python models. Even though it looks straightforward that this should work, it obviously wasn't implemented in the first round, so I guess it was not part of the original specs.
This issue has been marked as Stale because it has been open for 180 days with no activity. If you would like the issue to remain open, please remove the stale label or comment on the issue, or it will be closed in 7 days.
Is this your first time submitting a feature request?
Describe the feature
Currently, Python models are always saved to the default location. Even when I set `location_root` in the model config, it does not set an explicit storage location for the Python model; the table is still created using the default `<schema>.<table_name>` logic.

Log output:

It would be great if dbt passed the `location_root` parameter to the write command, for example like this:
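The example snippet from the original issue did not survive here. As a hedged sketch of the requested behavior (the helper, names, and paths below are hypothetical illustrations, not the actual dbt-spark implementation), the Python-model write step could derive an explicit table path from `location_root`, mirroring how the SQL materializations emit `LOCATION '<location_root>/<identifier>'`:

```python
def table_location(location_root, identifier):
    """Return the explicit storage path for a model, or None to fall back
    to the warehouse default (the <schema>.<table_name> location).

    Hypothetical helper for illustration only.
    """
    if location_root is None:
        return None
    return f"{location_root.rstrip('/')}/{identifier}"


# In the Python-model materialization, dbt could then pass the path to the
# Spark writer (sketch; `df`, `schema`, and `identifier` would come from dbt):
#
#   writer = df.write.mode("overwrite").format("delta")
#   path = table_location(config.get("location_root"), identifier)
#   if path is not None:
#       writer = writer.option("path", path)
#   writer.saveAsTable(f"{schema}.{identifier}")

print(table_location("/mnt/lake/marts", "my_python_model"))
# → /mnt/lake/marts/my_python_model
```

With `location_root` unset, the helper returns `None` and the write falls back to today's default behavior, so existing models would be unaffected.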
Describe alternatives you've considered
No response
Who will this benefit?
Users of the sources:
Are you interested in contributing this feature?
Unfortunately, I am not deep enough into dbt to develop this myself.
Anything else?
No response