dbt-labs / dbt-spark

dbt-spark contains all of the code enabling dbt to work with Apache Spark and Databricks
https://getdbt.com
Apache License 2.0

persist view column comments #866

Closed · jurasan closed this 1 year ago

jurasan commented 1 year ago

resolves #372

Problem

View column descriptions are not persisted in the view schema.

Solution

View column descriptions are persisted as column-level comments. Used this dbt-snowflake commit as an example.
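For context, here is a minimal PySpark sketch of the DDL shape the change needs to emit; it is not this PR's implementation (the actual change appears to live in the Jinja macros in dbt/include/spark/macros/adapters.sql). Spark SQL accepts per-column COMMENT clauses in a view's column list, which is where the descriptions from the model's schema yml would land. The local session and the src/customer_view names below are made up for illustration.

from pyspark.sql import SparkSession

# Local session purely for illustration; dbt would run the generated SQL on the target cluster.
spark = (
    SparkSession.builder
    .master("local[1]")
    .appName("view-column-comments")
    .getOrCreate()
)

spark.sql("CREATE TABLE IF NOT EXISTS src (id INT, name STRING) USING parquet")

# Per-column COMMENT clauses in the view's column list persist the descriptions
# into the view schema, alongside an optional view-level COMMENT.
spark.sql("""
    CREATE OR REPLACE VIEW customer_view (
        id   COMMENT 'surrogate key',
        name COMMENT 'customer display name'
    )
    COMMENT 'view-level description'
    AS SELECT id, name FROM src
""")

# DESCRIBE shows the comments persisted in the view's schema.
spark.sql("DESCRIBE TABLE customer_view").show(truncate=False)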

Checklist

I was not able to run tests for it. Here is the PR with the test, but the unit tests cannot call dbt-core macros; a sketch of the issue follows the traceback below.

@internalcode
    def _fail_with_undefined_error(
        self, *args: t.Any, **kwargs: t.Any
    ) -> "te.NoReturn":
        """Raise an :exc:`UndefinedError` when operations are performed
        on the undefined value.
        """
>       raise self._undefined_exception(self._undefined_message)
E       jinja2.exceptions.UndefinedError: 'get_columns_in_query' is undefined

../../miniconda3/lib/python3.10/site-packages/jinja2/runtime.py:852: UndefinedError
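The failure above presumably comes from rendering the adapter macros in a bare Jinja environment, where dbt-core's global macros (such as get_columns_in_query) are never loaded. Here is a minimal, self-contained sketch, not the repo's actual test harness, showing the failure mode and one way to stub the missing macro into the render context:

from unittest import mock
from jinja2 import Environment

env = Environment(extensions=["jinja2.ext.do"])

# A template that leans on a dbt-core global, as the adapter macros do.
template = env.from_string(
    "{% set columns = get_columns_in_query(sql) %}{{ columns | join(', ') }}"
)

# Rendering with no stub in the context reproduces the error above:
#   jinja2.exceptions.UndefinedError: 'get_columns_in_query' is undefined
# Passing a mock for the macro lets the template render in isolation.
stub = mock.Mock(return_value=["id", "name"])
print(template.render(get_columns_in_query=stub, sql="select * from src"))  # id, name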
cla-bot[bot] commented 1 year ago

Thanks for your pull request, and welcome to our community! We require contributors to sign our Contributor License Agreement and we don't seem to have your signature on file. Check out this article for more information on why we have a CLA.

In order for us to review and merge your code, please submit the Individual Contributor License Agreement form attached above. If you have questions about the CLA, or if you believe you've received this message in error, please reach out through a comment on this PR.

CLA has not been signed by users: @jurasan

jurasan commented 1 year ago

@cla-bot check

cla-bot[bot] commented 1 year ago

The cla-bot has been summoned, and re-checked this pull request!

colin-rogers-dbt commented 1 year ago

@jurasan can you add a changelog summarizing this change?

mikealfare commented 1 year ago

@jurasan The failing code check is for whitespace in dbt/include/spark/macros/adapters.sql. You can either fix it manually or run pre-commit run --all-files to have it fix it for you.

Also, I don't understand why the test in your linked PR wouldn't be able to run in this PR. We regularly call dbt-core macros. Could you talk more about that?

Fleid commented 1 year ago

@JCZuurmond for information only

jurasan commented 1 year ago

@mikealfare

> @jurasan The failing code check is for whitespace in dbt/include/spark/macros/adapters.sql. You can either fix it manually or run pre-commit run --all-files to have it fix it for you.

fixed this

> Also, I don't understand why the test in your linked PR wouldn't be able to run in this PR. We regularly call dbt-core macros. Could you talk more about that?

Yes, we do call dbt-core macros, but if you look at test_macros.py I don't think those calls are covered there. Maybe it's because of the way those macros are invoked in the __run_macro method?

colin-rogers-dbt commented 1 year ago

Set this to auto-merge; it looks like all tests are passing. @jurasan, can you update your branch with the latest from main?

colin-rogers-dbt commented 1 year ago

resolved in #893