elementary-data / elementary

The dbt-native data observability solution for data & analytics engineers. Monitor your data pipelines in minutes. Available as self-hosted or cloud service with premium features.
https://www.elementary-data.com/
Apache License 2.0

If you lose the connection to Redshift, the temp table disappears and cannot be loaded. #1550

Open clarkflowers opened 5 months ago

clarkflowers commented 5 months ago

Running with dbt=1.7.7 Registered adapter: redshift=1.7.2

I am having an issue where I get this error:

15:54:19 Running 1 on-run-end hook
15:54:24 Elementary: Could not extract columns for table - NONE.DBT_MODELS__TMP_20240607155422928660155422936218 (might be a permissions issue)

So I reviewed the dbt log and found this:

15:23:27.577076 [debug] [Thread-1 (]: Began executing node model.elementary.dbt_models
15:23:27.707072 [debug] [Thread-1 (]: Using redshift connection "model.elementary.dbt_models"
15:23:27.708358 [debug] [Thread-1 (]: On model.elementary.dbt_models: /* {"app": "dbt", "dbt_version": "1.8.1", "profile_name": "bremer_bank", "target_name": "SANDBOX", "node_id": "model.elementary.dbt_models"} */ create temporary table "dbt_models__dbt_tmp152327636145" ... ) as empty_table where 1 = 0 );
15:23:27.709114 [debug] [Thread-1 (]: Opening a new connection, currently in state closed
15:23:27.709877 [debug] [Thread-1 (]: Redshift adapter: Establishing connection using ssl with sslmode set to 'prefer'. To connect without ssl, set sslmode to 'disable'.
15:23:27.710376 [debug] [Thread-1 (]: Redshift adapter: Connecting to redshift with IAM based auth...
15:23:27.998888 [debug] [Thread-1 (]: SQL status: SUCCESS in 0.0 seconds
15:23:28.033597 [debug] [Thread-1 (]: Using redshift connection "model.elementary.dbt_models"

The next step is to pull the columns from the data dictionary. Since the code uses temporary tables and the connection to Redshift is lost, the temp table is dropped. Then, on the new connection, there are no columns to look up.
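The failure mode described above can be sketched with SQLite standing in for Redshift (an illustrative sketch only, not elementary's code): in both databases, TEMP tables belong to a single session and vanish when that session ends, so a reconnect leaves nothing to introspect.

```python
import os
import sqlite3
import tempfile

# Hypothetical database file; SQLite stands in for Redshift here.
path = os.path.join(tempfile.mkdtemp(), "demo.db")

# Session 1: create a temp table; it is visible within its own session.
conn1 = sqlite3.connect(path)
conn1.execute("CREATE TEMP TABLE dbt_models__tmp (unique_id TEXT)")
before = conn1.execute(
    "SELECT count(*) FROM sqlite_temp_master WHERE name = 'dbt_models__tmp'"
).fetchone()[0]   # 1: the temp table exists in this session

conn1.close()     # simulate the lost connection to the warehouse

# Session 2: dbt reopens a connection and looks for the table's columns.
conn2 = sqlite3.connect(path)
after = conn2.execute(
    "SELECT count(*) FROM sqlite_temp_master WHERE name = 'dbt_models__tmp'"
).fetchone()[0]   # 0: the temp table died with the old session
```

This is why the on-run-end hook finds no columns after a reconnect: the metadata lookup runs against a session in which the temp table never existed.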

clarkflowers commented 5 months ago

For some reason, when I run these commands from the dbt log, the columns do not show up either.

create temporary table "dbt_models__dbt_tmp152327636145" as (
  select * from (
    select
      cast('dummy_string' as varchar(4096)) as unique_id,
      cast('dummy_string' as varchar(4096)) as alias,
      cast('dummy_string' as varchar(4096)) as checksum,
      cast('dummy_string' as varchar(4096)) as materialization,
      cast('this_is_just_a_long_dummy_string' as varchar(65535)) as tags,
      cast('this_is_just_a_long_dummy_string' as varchar(65535)) as meta,
      cast('dummy_string' as varchar(4096)) as owner,
      cast('dummy_string' as varchar(4096)) as database_name,
      cast('dummy_string' as varchar(4096)) as schema_name,
      cast('this_is_just_a_long_dummy_string' as varchar(65535)) as depends_on_macros,
      cast('this_is_just_a_long_dummy_string' as varchar(65535)) as depends_on_nodes,
      cast('this_is_just_a_long_dummy_string' as varchar(65535)) as description,
      cast('dummy_string' as varchar(4096)) as name,
      cast('dummy_string' as varchar(4096)) as package_name,
      cast('this_is_just_a_long_dummy_string' as varchar(65535)) as original_path,
      cast('dummy_string' as varchar(4096)) as path,
      cast('dummy_string' as varchar(4096)) as patch_path,
      cast('dummy_string' as varchar(4096)) as generated_at,
      cast('dummy_string' as varchar(4096)) as metadata_hash
  ) as empty_table
  where 1 = 0
);

select * from dbt_models__dbt_tmp152327636145; -- make sure it is there
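One likely reason the columns do not show up when the statements are replayed manually (again a sketch with SQLite as a stand-in, since its temp tables are session-scoped like Redshift's): a temp table created in one session is invisible to every other session, so a separate SQL client can never see the temp table dbt created.

```python
import os
import sqlite3
import tempfile

# Hypothetical database file shared by two independent sessions.
path = os.path.join(tempfile.mkdtemp(), "demo.db")

writer = sqlite3.connect(path)   # the session that created the temp table
writer.execute("CREATE TEMP TABLE dbt_models__dbt_tmp (unique_id TEXT)")

reader = sqlite3.connect(path)   # a separate session, e.g. a SQL client

visible_to_writer = writer.execute(
    "SELECT count(*) FROM sqlite_temp_master WHERE name = 'dbt_models__dbt_tmp'"
).fetchone()[0]  # 1: visible inside the creating session
visible_to_reader = reader.execute(
    "SELECT count(*) FROM sqlite_temp_master WHERE name = 'dbt_models__dbt_tmp'"
).fetchone()[0]  # 0: invisible to any other session
```

So replaying the CREATE and SELECT in a fresh client session cannot reproduce what dbt saw unless both statements run on the same connection.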

with bound_views as (
  select
    ordinal_position,
    table_schema,
    column_name,
    data_type,
    character_maximum_length,
    numeric_precision,
    numeric_scale
  from information_schema."columns"
  where table_name = 'dbt_snapshots__dbt_tmp171637773276'
),

unbound_views as (
  select
    ordinal_position,
    view_schema,
    col_name,
    case
      when col_type ilike 'character varying%' then
        'character varying'
      when col_type ilike 'numeric%' then 'numeric'
      else col_type
    end as col_type,
    case
      when col_type like 'character%'
      then nullif(REGEXP_SUBSTR(col_type, '[0-9]+'), '')::int
      else null
    end as character_maximum_length,
    case
      when col_type like 'numeric%'
      then nullif(
        SPLIT_PART(REGEXP_SUBSTR(col_type, '[0-9,]+'), ',', 1),
        '')::int
      else null
    end as numeric_precision,
    case
      when col_type like 'numeric%'
      then nullif(
        SPLIT_PART(REGEXP_SUBSTR(col_type, '[0-9,]+'), ',', 2),
        '')::int
      else null
    end as numeric_scale

  from pg_get_late_binding_view_cols()
  cols(view_schema name, view_name name, col_name name,
       col_type varchar, ordinal_position int)
  where view_name = 'dbt_snapshots__dbt_tmp171637773276'
),

external_views as (
  select
    columnnum,
    schemaname,
    columnname,
    case
      when external_type ilike 'character varying%' or external_type ilike 'varchar%'
      then 'character varying'
      when external_type ilike 'numeric%' then 'numeric'
      else external_type
    end as external_type,
    case
      when external_type like 'character%' or external_type like 'varchar%'
      then nullif(
        REGEXP_SUBSTR(external_type, '[0-9]+'),
        '')::int
      else null
    end as character_maximum_length,
    case
      when external_type like 'numeric%'
      then nullif(
        SPLIT_PART(REGEXP_SUBSTR(external_type, '[0-9,]+'), ',', 1),
        '')::int
      else null
    end as numeric_precision,
    case
      when external_type like 'numeric%'
      then nullif(
        SPLIT_PART(REGEXP_SUBSTR(external_type, '[0-9,]+'), ',', 2),
        '')::int
      else null
    end as numeric_scale
  from
    pg_catalog.svv_external_columns
  where
    schemaname = 'None'
    and tablename = 'dbt_snapshots__dbt_tmp171637773276'

),

unioned as (
  select * from bound_views
  union all
  select * from unbound_views
  union all
  select * from external_views
)

select
  column_name,
  data_type,
  character_maximum_length,
  numeric_precision,
  numeric_scale

from unioned

order by ordinal_position;
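The CASE expressions in the query above can be mirrored in Python to see what they extract (an illustrative sketch, not elementary's code): the type name is normalized, and the length or precision/scale is pulled out of strings such as 'character varying(4096)' or 'numeric(10,2)' using the same `REGEXP_SUBSTR`/`SPLIT_PART` logic.

```python
import re

def parse_col_type(col_type: str):
    """Mirror the query's CASE expressions for one raw column type string."""
    lowered = col_type.lower()

    # First CASE: normalize the type name.
    if lowered.startswith("character varying"):
        data_type = "character varying"
    elif lowered.startswith("numeric"):
        data_type = "numeric"
    else:
        data_type = col_type

    char_max = precision = scale = None
    if lowered.startswith("character"):
        # REGEXP_SUBSTR(col_type, '[0-9]+') -> first run of digits
        m = re.search(r"[0-9]+", col_type)
        char_max = int(m.group()) if m else None
    if lowered.startswith("numeric"):
        # REGEXP_SUBSTR(col_type, '[0-9,]+') then SPLIT_PART(..., ',', 1 / 2)
        m = re.search(r"[0-9,]+", col_type)
        parts = (m.group() if m else "").split(",")
        precision = int(parts[0]) if parts[0] else None
        scale = int(parts[1]) if len(parts) > 1 and parts[1] else None

    return data_type, char_max, precision, scale
```

For example, `parse_col_type("numeric(10,2)")` yields a normalized type of `numeric` with precision 10 and scale 2, matching what the unioned query reports for that column.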

Gunnnn commented 3 months ago

Same bug for me with Postgres, dbt=1.5.11, elementary-package=0.16.0, edr=0.16.1. Some runs succeed and subsequent ones fail; the error is intermittent.