dbt-labs / dbt-redshift

dbt-redshift contains all of the code enabling dbt to work with Amazon Redshift
https://getdbt.com
Apache License 2.0

[Feature] Better error message for drop statement when user does not have access to table/view #897

Open victor-frank-signet opened 1 month ago

victor-frank-signet commented 1 month ago

Is this a new bug in dbt-redshift?

Current Behavior

When creating a view in Redshift with the following Jinja-configured model:

{{ 
    config(
    materialized = 'view',
    schema='test_schema'
    )
}}

with cte as (
    select * from testschema.test_table_a_tbl
)

select * from cte

The produced code creates a temp view and then sometimes runs drop external if exists, which is not valid Redshift syntax:

  create view "db"."test_schema"."test_view_a_vw__dbt_tmp" as (

with cte as (
    select * from testschema.test_table_a_tbl
)

select * from cte
  ) ;

  drop external if exists "db"."test_schema"."test_view_a_vw" cascade
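
My guess (not verified against the adapter source) is that this comes from a generic drop macro interpolating relation.type directly into the statement, so a relation that gets detected as external produces "drop external if exists". A rough sketch of that pattern in Jinja (macro name and exact structure are assumptions, not the actual dbt-core/dbt-redshift code):

{% macro sketch__drop_relation(relation) -%}
  {% call statement('drop_relation', auto_begin=False) -%}
    {%- if relation.is_table -%}
      drop table if exists {{ relation }} cascade
    {%- elif relation.is_view -%}
      drop view if exists {{ relation }} cascade
    {%- else -%}
      {#- fallback: the detected type is spliced in verbatim, which yields
          "drop external if exists ..." when the type resolves to 'external' -#}
      drop {{ relation.type }} if exists {{ relation }} cascade
    {%- endif -%}
  {%- endcall %}
{% endmacro %}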

Expected Behavior

{{ 
    config(
    materialized = 'view',
    schema='test_schema'
    )
}}

with cte as (
    select * from testschema.test_table_a_tbl
)

select * from cte

Compiles to:

  create view "db"."test_schema"."test_view_a_vw__dbt_tmp" as (

with cte as (
    select * from testschema.test_table_a_tbl
)

select * from cte
  ) ;

  drop view if exists "db"."test_schema"."test_view_a_vw" cascade

Steps To Reproduce

  1. dbt-redshift 1.8.1 with dbt-core==1.8.4 or 1.8.5
  2. dbt run --project-dir /tmp/dbt/test_project --profiles-dir /tmp/dbt/test_project --select testcomp.* --debug

Relevant log output

00:11:26  Began running node model.test_dir.test_view_a_vw
00:11:26  1 of 12 START sql view model test_schema_src.test_view_a_vw .... [RUN]
00:11:26  Re-using an available connection from the pool (formerly list_db_schema, now model.test_dir.test_view_a_vw)
00:11:26  Began compiling node model.test_dir.test_view_a_vw
00:11:26  Writing injected SQL for node "model.test_dir.test_view_a_vw"
00:11:26  Began executing node model.test_dir.test_view_a_vw
00:11:26  Writing runtime sql for node "model.test_dir.test_view_a_vw"
00:11:26  Using redshift connection "model.test_dir.test_view_a_vw"
00:11:26  On model.test_dir.test_view_a_vw: BEGIN
00:11:26  Opening a new connection, currently in state closed
00:11:26  Redshift adapter: Establishing connection using ssl with `sslmode` set to 'prefer'.To connect without ssl, set `sslmode` to 'disable'.
00:11:26  Redshift adapter: Connecting to redshift with username/password based auth...
00:11:26  SQL status: SUCCESS in 0.064 seconds
00:11:26  Using redshift connection "model.test_dir.test_view_a_vw"
00:11:26  On model.test_dir.test_view_a_vw: /* {"app": "dbt", "dbt_version": "1.8.4", "profile_name": "test_dir", "target_name": "env", "node_id": "model.test_dir.test_view_a_vw"} */

  create view "test_db"."test_schema_src"."test_view_a_vw__dbt_tmp" as (

with cte as (
    SELECT *
    FROM test_schema.test_schema_location_tbl ORG,

)

select 
    *
from cte
  ) ;
00:11:26  SQL status: SUCCESS in 0.033 seconds
00:11:26  Using redshift connection "model.test_dir.test_view_a_vw"
00:11:26  On model.test_dir.test_view_a_vw: /* {"app": "dbt", "dbt_version": "1.8.4", "profile_name": "test_dir", "target_name": "env", "node_id": "model.test_dir.test_view_a_vw"} */
alter table "test_db"."test_schema_src"."test_view_a_vw" rename to "test_view_a_vw__dbt_backup"
00:11:26  SQL status: SUCCESS in 0.004 seconds
00:11:26  Using redshift connection "model.test_dir.test_view_a_vw"
00:11:26  On model.test_dir.test_view_a_vw: /* {"app": "dbt", "dbt_version": "1.8.4", "profile_name": "test_dir", "target_name": "env", "node_id": "model.test_dir.test_view_a_vw"} */
alter table "test_db"."test_schema_src"."test_view_a_vw__dbt_tmp" rename to "test_view_a_vw"
00:11:26  SQL status: SUCCESS in 0.004 seconds
00:11:26  On model.test_dir.test_view_a_vw: COMMIT
00:11:26  Using redshift connection "model.test_dir.test_view_a_vw"
00:11:26  On model.test_dir.test_view_a_vw: COMMIT
00:11:26  SQL status: SUCCESS in 0.460 seconds
00:11:26  Using redshift connection "model.test_dir.test_view_a_vw"
00:11:26  On model.test_dir.test_view_a_vw: BEGIN
00:11:26  SQL status: SUCCESS in 0.003 seconds
00:11:26  Applying DROP to: "test_db"."test_schema_src"."test_view_a_vw__dbt_backup"
00:11:26  Using redshift connection "model.test_dir.test_view_a_vw"
00:11:26  On model.test_dir.test_view_a_vw: /* {"app": "dbt", "dbt_version": "1.8.4", "profile_name": "test_dir", "target_name": "env", "node_id": "model.test_dir.test_view_a_vw"} */
drop view if exists "test_db"."test_schema_src"."test_view_a_vw__dbt_backup" cascade
00:11:27  SQL status: SUCCESS in 0.743 seconds
00:11:27  On model.test_dir.test_view_a_vw: COMMIT
00:11:27  Using redshift connection "model.test_dir.test_view_a_vw"
00:11:27  On model.test_dir.test_view_a_vw: COMMIT
00:11:28  SQL status: SUCCESS in 0.472 seconds
00:11:28  Using redshift connection "model.test_dir.test_view_a_vw"
00:11:28  On model.test_dir.test_view_a_vw: BEGIN
00:11:28  SQL status: SUCCESS in 0.001 seconds
00:11:28  On model.test_dir.test_view_a_vw: ROLLBACK
00:11:28  On model.test_dir.test_view_a_vw: Close
00:11:28  Sending event: {'category': 'dbt', 'action': 'run_model', 'label': '37206027-27cd-44a4-99ac-7e279f6c8dbd', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x7f1db1685650>]}
00:11:28  1 of 12 OK created sql view model test_schema_src.test_view_a_vw  [SUCCESS in 1.99s]
00:11:28  Finished running node model.test_dir.test_view_a_vw
00:11:28  Began running node model.test_dir.test_schema_time_dimension_vw
00:11:28  2 of 12 START sql view model test_schema_src.test_schema_time_dimension_vw ............. [RUN]
00:11:28  Re-using an available connection from the pool (formerly model.test_dir.test_view_a_vw, now model.test_dir.test_schema_time_dimension_vw)
00:11:28  Began compiling node model.test_dir.test_schema_time_dimension_vw
00:11:28  Writing injected SQL for node "model.test_dir.test_schema_time_dimension_vw"
00:11:28  Began executing node model.test_dir.test_schema_time_dimension_vw
00:11:28  Using redshift connection "model.test_dir.test_schema_time_dimension_vw"
00:11:28  On model.test_dir.test_schema_time_dimension_vw: BEGIN
00:11:28  Opening a new connection, currently in state closed
00:11:28  Redshift adapter: Establishing connection using ssl with `sslmode` set to 'prefer'.To connect without ssl, set `sslmode` to 'disable'.
00:11:28  Redshift adapter: Connecting to redshift with username/password based auth...
00:11:28  SQL status: SUCCESS in 0.081 seconds
00:11:28  Applying DROP to: "test_db"."test_schema_src"."test_schema_time_dimension_vw__dbt_tmp"
00:11:28  Using redshift connection "model.test_dir.test_schema_time_dimension_vw"
00:11:28  On model.test_dir.test_schema_time_dimension_vw: /* {"app": "dbt", "dbt_version": "1.8.4", "profile_name": "test_dir", "target_name": "env", "node_id": "model.test_dir.test_schema_time_dimension_vw"} */
drop view if exists "test_db"."test_schema_src"."test_schema_time_dimension_vw__dbt_tmp" cascade
00:11:28  SQL status: SUCCESS in 0.246 seconds
00:11:28  On model.test_dir.test_schema_time_dimension_vw: COMMIT
00:11:28  Using redshift connection "model.test_dir.test_schema_time_dimension_vw"
00:11:28  On model.test_dir.test_schema_time_dimension_vw: COMMIT
00:11:29  SQL status: SUCCESS in 1.450 seconds
00:11:29  Using redshift connection "model.test_dir.test_schema_time_dimension_vw"
00:11:29  On model.test_dir.test_schema_time_dimension_vw: BEGIN
00:11:29  SQL status: SUCCESS in 0.002 seconds
00:11:29  Writing runtime sql for node "model.test_dir.test_schema_time_dimension_vw"
00:11:29  Using redshift connection "model.test_dir.test_schema_time_dimension_vw"
00:11:29  On model.test_dir.test_schema_time_dimension_vw: /* {"app": "dbt", "dbt_version": "1.8.4", "profile_name": "test_dir", "target_name": "env", "node_id": "model.test_dir.test_schema_time_dimension_vw"} */

  create view "test_db"."test_schema_src"."test_schema_time_dimension_vw__dbt_tmp" as (

with cte as (
    select * from test_schema.test_schema_time_dimension_tbl
)

select * from cte
  ) ;
00:11:30  SQL status: SUCCESS in 0.039 seconds
00:11:30  On model.test_dir.test_schema_time_dimension_vw: COMMIT
00:11:30  Using redshift connection "model.test_dir.test_schema_time_dimension_vw"
00:11:30  On model.test_dir.test_schema_time_dimension_vw: COMMIT
00:11:30  SQL status: SUCCESS in 0.841 seconds
00:11:30  Using redshift connection "model.test_dir.test_schema_time_dimension_vw"
00:11:30  On model.test_dir.test_schema_time_dimension_vw: BEGIN
00:11:30  SQL status: SUCCESS in 0.008 seconds
00:11:30  Applying DROP to: "test_db"."test_schema_src"."test_schema_time_dimension_vw"
00:11:30  Using redshift connection "model.test_dir.test_schema_time_dimension_vw"
00:11:30  On model.test_dir.test_schema_time_dimension_vw: /* {"app": "dbt", "dbt_version": "1.8.4", "profile_name": "test_dir", "target_name": "env", "node_id": "model.test_dir.test_schema_time_dimension_vw"} */
drop external if exists "test_db"."test_schema_src"."test_schema_time_dimension_vw" cascade
00:11:30  Redshift adapter: Redshift error: syntax error at or near "if" in context "
drop external if", at line 2, column 15
00:11:30  On model.test_dir.test_schema_time_dimension_vw: ROLLBACK
00:11:30  Redshift adapter: Error running SQL: macro drop_relation
00:11:30  Redshift adapter: Rolling back transaction.
00:11:30  On model.test_dir.test_schema_time_dimension_vw: Close
00:11:30  Database Error in model test_schema_time_dimension_vw (models/comp/test_schema_time_dimension_vw.sql)
  syntax error at or near "if" in context "
  drop external if", at line 2, column 15
  compiled Code at target/run/test_dir/models/comp/test_schema_time_dimension_vw.sql
00:11:30  Sending event: {'category': 'dbt', 'action': 'run_model', 'label': '37206027-27cd-44a4-99ac-7e279f6c8dbd', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x7f1db1685650>]}

Environment

- OS: Linux AMI
- Python: 3.9
- dbt-core: 1.8.4 or 1.8.5
- dbt-redshift: 1.8.1

Additional Context

No response

victor-frank-signet commented 1 month ago

This is probably no longer a bug. It was resolved on my end by changing the permissions on the view and the table, after which dbt could resolve that they are not external. I still think this is at least an enhancement, and maybe a bug: if the view exists but the dbt user does not have permission on it, dbt should not produce the invalid SQL "drop external if ..."; instead it should report a permission error while running the correct Redshift statement ("drop view|table if exists ...").
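
For anyone hitting the same thing, the change on my side was roughly the following grants, so the adapter's metadata queries can see the existing objects and classify their types (schema, object, and user names below are placeholders, not the exact statements I ran):

grant usage on schema test_schema to dbt_user;
grant select on test_schema.test_view_a_vw to dbt_user;
grant select on testschema.test_table_a_tbl to dbt_user;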

It has something to do with how the relation.type is populated in: 00:11:30 Redshift adapter: Error running SQL: macro drop_relation
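
If it helps, one possible interim guard (a sketch only; it assumes drop_relation is still dispatched to a redshift__drop_relation macro that a root-project macro can override, which I have not confirmed against the 1.8 source) would be to fail with a readable error on unexpected relation types instead of emitting invalid SQL:

{% macro redshift__drop_relation(relation) -%}
  {%- if relation.type not in ('table', 'view', 'materialized_view') -%}
    {#- likely a permissions/metadata issue: surface it instead of
        generating "drop external if exists ..." -#}
    {{ exceptions.raise_compiler_error(
        "Cannot drop " ~ relation ~ ": detected relation type '" ~ relation.type ~
        "'. Check that the dbt user has access to the existing object.") }}
  {%- endif -%}
  {% call statement('drop_relation', auto_begin=False) -%}
    drop {{ relation.type }} if exists {{ relation }} cascade
  {%- endcall %}
{% endmacro %}

The real fix presumably belongs in how the adapter resolves relation.type when the metadata is not visible, but a check like this would at least turn the syntax error into a permission-shaped message.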