Closed: martelli closed this pull request 1 year ago
Thanks for your pull request, and welcome to our community! We require contributors to sign our Contributor License Agreement and we don't seem to have your signature on file. Check out this article for more information on why we have a CLA.
In order for us to review and merge your code, please submit the Individual Contributor License Agreement form attached above. If you have questions about the CLA, or if you believe you've received this message in error, please reach out through a comment on this PR.
CLA has not been signed by users: @martelli
Duplicate of https://github.com/dbt-labs/dbt-spark/pull/848

Fixed in #848.
The call to `tblproperties_clause()` is missing from the `CREATE TABLE` statement building. It is needed in order to use dbt-spark on top of Iceberg/S3, since we need to pass parameters at table creation time.

### Problem
When creating tables on Spark/Iceberg/S3, dbt will not honor the `TBLPROPERTIES` defined in the config. There are various properties that need to be set at `CREATE TABLE` time. In my particular case, I need to enable `write.object-storage.enabled` to avoid being throttled by AWS S3.

### Solution
By including the call to `tblproperties_clause` inside the macro `spark__create_table_as`, the properties will be included in the `CREATE TABLE` statement, thus fixing the issue.

### Checklist
Test Plan:
By using the following config in the example model:
It produced:
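The original config and output blocks did not survive here; a model config along these lines would exercise the change (property values are illustrative, and the model/schema names are hypothetical):

```sql
{{ config(
    materialized='table',
    file_format='iceberg',
    tblproperties={
        'write.object-storage.enabled': 'true'
    }
) }}

select 1 as id
```

which should compile to a statement of roughly this shape:

```sql
create table my_schema.my_model
using iceberg
tblproperties ('write.object-storage.enabled' = 'true')
as
select 1 as id
```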