Fixes a typo in `upload_results()` that made the batch size used when uploading models too large, affecting BigQuery only.
Update type - breaking / non-breaking
[x] Minor bug fix
[ ] Documentation improvements
[ ] Quality of Life improvements
[ ] New features (non-breaking change)
[ ] New features (breaking change)
[ ] Other (non-breaking change)
[ ] Other (breaking change)
[ ] Release preparation
What does this solve?
This typo made the batch size used for models on BigQuery too large, so uploading models failed with a query-size error. Testing the same changeset on a different fork, in a separate (complex, but private) project, uploads models successfully, whereas the v2.6.2 release of dbt_artifacts fails there.
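For context, here is a minimal Python sketch of the failure mode (this is illustrative only, not the package's actual Jinja macro; the variable and function names are hypothetical): when the chunking logic references the wrong batch-size variable, each generated `INSERT` statement contains far more rows than intended and can exceed BigQuery's query size limit.

```python
# Illustrative sketch only -- not the dbt_artifacts macro. Names are
# hypothetical; it just shows how picking up the wrong batch-size
# variable produces one oversized batch instead of many small ones.

def chunk(rows, batch_size):
    """Split rows into batches of at most batch_size rows each."""
    return [rows[i:i + batch_size] for i in range(0, len(rows), batch_size)]

rows = [f"('model_{i}', 'compiled code ...')" for i in range(1000)]

models_batch_size = 50      # intended limit for model uploads
other_batch_size = 1000     # limit belonging to a different artifact type

# Buggy: the typo picks up the wrong variable, so one huge batch is
# built and the resulting INSERT can exceed BigQuery's query size limit.
buggy_batches = chunk(rows, other_batch_size)

# Fixed: the correct variable keeps each generated INSERT small.
fixed_batches = chunk(rows, models_batch_size)

print(len(buggy_batches), len(fixed_batches))  # 1 vs 20 batches
```

The fix is simply to reference the intended batch-size variable, so model uploads are split into appropriately sized statements again.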
Outstanding questions
N/A, this is a straightforward fix, I think!
What databases have you tested with?
[ ] Snowflake
[x] Google BigQuery
[ ] Databricks
[ ] Spark
[ ] N/A
This changeset passes integration tests on BigQuery. I don't have access to other databases, so I haven't tested elsewhere; however, the change should only affect behaviour on BigQuery.