dbt-labs / dbt-core

dbt enables data analysts and engineers to transform their data using the same practices that software engineers use to build applications.
https://getdbt.com
Apache License 2.0

[CT-3163] [Bug] dbt snapshot with strategy='check' not working #8723

Closed sfwatergit closed 10 months ago

sfwatergit commented 1 year ago

Is this a new bug in dbt-core?

Current Behavior

In our development environment, I am running dbt snapshot with strategy='check' on a table (an example record was shown in an attached screenshot). The snapshot is defined as follows:

{% snapshot bdapi__entities_snapshot %}
    {{
        config(
            target_database='dap1_sources',
            unique_key='id',

            strategy='check',
            check_cols=['name','emailAddress', 'organization.id', 'organization.name','hub.id','hub.name', 'address', 'status', 'phoneNumber', 'primaryContact','organizationEntityConfiguration.lastUpdated']
        )

    }}

select *
from {{ source('bdapi', 'bdapi__entities') }}
{% endsnapshot %}

Note that although there is an organizationEntityConfiguration.lastUpdated field, it is not always populated, which is why we do not use the timestamp strategy.

I can verify that nothing changes in the source table (bdapi__entities) between dbt snapshot executions, and yet each run of dbt snapshot adds a new row to the snapshot table. This is not the behavior I would expect.
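As an untested workaround sketch (the flattened alias names below are my own, not from the project): projecting the nested struct members to top-level columns inside the snapshot query lets check_cols reference plain column names instead of dotted struct paths:

```sql
{% snapshot bdapi__entities_snapshot %}
    {{
        config(
            target_database='dap1_sources',
            unique_key='id',
            strategy='check',
            check_cols=[
                'name', 'emailAddress', 'organization_id', 'organization_name',
                'hub_id', 'hub_name', 'address', 'status', 'phoneNumber',
                'primaryContact', 'org_entity_config_last_updated'
            ]
        )
    }}

select
    *,
    -- flatten the struct members referenced in check_cols
    organization.id   as organization_id,
    organization.name as organization_name,
    hub.id            as hub_id,
    hub.name          as hub_name,
    organizationEntityConfiguration.lastUpdated as org_entity_config_last_updated
from {{ source('bdapi', 'bdapi__entities') }}

{% endsnapshot %}
```

This widens the snapshot table by the aliased columns, but every name in check_cols then corresponds to a real top-level column in both the source query and the snapshot table.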

Expected Behavior

Given the documentation, I would expect a new row to be appended only when the data for a given row changes. I have attached another screenshot showing many entries for the same entity (id=04fcbe57-9928-4121-9d78-38694310ab11) with no differences between rows in the checked (or, for that matter, any) columns. I was not able to capture the dbt metadata fields in the screenshot, but dbt_scd_id, dbt_updated_at, dbt_valid_from, and dbt_valid_to all differ between rows, as would be expected if the data had actually changed between snapshot executions.

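One clue in the log output below: the compiled SQL for bdapi__entities_snapshot (the snapshot with dotted check_cols) has its change condition reduced to `(TRUE)`, while bdapi__assets_snapshot (plain column names) gets a full column-by-column comparison. dbt-core's actual check-strategy macro is more involved than this, but the following hypothetical sketch illustrates the suspected failure mode: if the change predicate is built only from check_cols entries that match real top-level columns, dotted struct paths match nothing and every row is treated as changed.

```python
# Hypothetical illustration only -- NOT dbt-core's actual implementation.
# Shows how a change condition built from matched check_cols can collapse
# to TRUE when the configured names are dotted struct paths that do not
# exist as top-level columns.

def build_check_condition(check_cols, table_columns):
    """Return a SQL predicate comparing each matched checked column,
    or 'TRUE' (every row considered changed) when none qualify."""
    matched = [c for c in check_cols if c in table_columns]
    if not matched:
        return "TRUE"
    return " or ".join(
        f"snapshotted.`{c}` is distinct from source.`{c}`" for c in matched
    )

top_level = ["id", "name", "emailAddress", "organization", "hub"]
dotted = ["organization.id", "organization.name", "hub.id"]

print(build_check_condition(dotted, top_level))     # -> TRUE
print(build_check_condition(["name"], top_level))   # real comparison
```

Under this hypothesis, the 'check' strategy silently degenerates into "always insert" rather than raising an error about unresolvable check_cols.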

Steps To Reproduce

  1. On a Databricks Ubuntu 20.04 cluster (13.1 runtime),
  2. with dbt-core = 1.6.3 and dbt-databricks = 1.6.4,
  3. run dbt snapshot using the snapshot configuration shown under Current Behavior.
  4. Observe that a new snapshot row is added even though the checked fields are identical to the previously recorded row.

Relevant log output

============================== 12:18:26.169758 | 1f3f02f7-8f79-4cb7-b515-d8f694742d97 ==============================
12:18:26.169758 [info ] [MainThread]: Running with dbt=1.6.3
12:18:26.170189 [debug] [MainThread]: running dbt with arguments {'printer_width': '80', 'indirect_selection': 'eager', 'log_cache_events': 'False', 'write_json': 'True', 'partial_parse': 'True', 'cache_selected_only': 'False', 'warn_error': 'None', 'fail_fast': 'False', 'debug': 'False', 'log_path': '/Users/pz1zy9/dev/bd/repos/dig-databricks-dbt/transform/databricks_dbt/logs', 'version_check': 'True', 'profiles_dir': '/Users/pz1zy9/.dbt', 'use_colors': 'True', 'use_experimental_parser': 'False', 'no_print': 'None', 'quiet': 'False', 'log_format': 'default', 'invocation_command': 'dbt snapshot --target dap', 'introspect': 'True', 'static_parser': 'True', 'target_path': 'None', 'warn_error_options': 'WarnErrorOptions(include=[], exclude=[])', 'send_anonymous_usage_stats': 'True'}
12:18:29.585454 [debug] [MainThread]: Sending event: {'category': 'dbt', 'action': 'project_id', 'label': '1f3f02f7-8f79-4cb7-b515-d8f694742d97', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x110bbebc0>]}
12:18:29.591810 [debug] [MainThread]: Sending event: {'category': 'dbt', 'action': 'adapter_info', 'label': '1f3f02f7-8f79-4cb7-b515-d8f694742d97', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x13654c940>]}
12:18:29.592206 [info ] [MainThread]: Registered adapter: databricks=1.6.4
12:18:29.693543 [debug] [MainThread]: checksum: c5b12710b84b4499387d01421b0e554976a41fdd092f97aea2030a9d1b4e195d, vars: {}, profile: , target: dap, version: 1.6.3
12:18:30.188657 [debug] [MainThread]: Partial parsing enabled: 0 files deleted, 0 files added, 6 files changed.
12:18:30.190063 [debug] [MainThread]: Partial parsing: updated file: databricks_dbt://models/staging/genability/_genability__sources.yml
12:18:30.190437 [debug] [MainThread]: Partial parsing: updated file: databricks_dbt://models/staging/bdapi/_bdapi__sources.yml
12:18:30.192061 [debug] [MainThread]: Partial parsing: updated file: databricks_dbt://models/staging/evh/_evh__sources.yml
12:18:30.192887 [debug] [MainThread]: Partial parsing: updated file: databricks_dbt://models/staging/snapshots/_snapshots_sources.yml
12:18:30.193145 [debug] [MainThread]: Partial parsing: updated file: databricks_dbt://snapshots/bdapi/bdapi__assets_snapshot.sql
12:18:30.193383 [debug] [MainThread]: Partial parsing: updated file: databricks_dbt://snapshots/bdapi/bdapi__entities_snapshot.sql
12:18:30.827056 [debug] [MainThread]: Sending event: {'category': 'dbt', 'action': 'load_project', 'label': '1f3f02f7-8f79-4cb7-b515-d8f694742d97', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x15c3040d0>]}
12:18:30.864319 [debug] [MainThread]: Sending event: {'category': 'dbt', 'action': 'resource_counts', 'label': '1f3f02f7-8f79-4cb7-b515-d8f694742d97', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x1379023b0>]}
12:18:30.864774 [info ] [MainThread]: Found 35 models, 4 seeds, 2 snapshots, 48 tests, 6 sources, 0 exposures, 0 metrics, 880 macros, 0 groups, 0 semantic models
12:18:30.865041 [debug] [MainThread]: Sending event: {'category': 'dbt', 'action': 'runnable_timing', 'label': '1f3f02f7-8f79-4cb7-b515-d8f694742d97', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x1379023e0>]}
12:18:30.866877 [info ] [MainThread]: 
12:18:30.867542 [debug] [MainThread]: Acquiring new databricks connection 'master'
12:18:30.868335 [debug] [ThreadPool]: Acquiring new databricks connection 'list_dap1_sources'
12:18:30.868791 [debug] [ThreadPool]: Using databricks connection "list_dap1_sources"
12:18:30.869145 [debug] [ThreadPool]: On list_dap1_sources: GetSchemas(database=`dap1_sources`, schema=None)
12:18:30.869524 [debug] [ThreadPool]: Opening a new connection, currently in state init
12:18:32.459383 [debug] [ThreadPool]: SQL status: OK in 1.590000033378601 seconds
12:18:32.471744 [debug] [ThreadPool]: On list_dap1_sources: Close
12:18:32.792123 [debug] [ThreadPool]: Re-using an available connection from the pool (formerly list_dap1_sources, now create_dap1_sources_snapshots)
12:18:32.792935 [debug] [ThreadPool]: Creating schema "database: "dap1_sources"
schema: "snapshots"
"
12:18:32.801020 [debug] [ThreadPool]: Spark adapter: NotImplemented: add_begin_query
12:18:32.813061 [debug] [ThreadPool]: Using databricks connection "create_dap1_sources_snapshots"
12:18:32.813488 [debug] [ThreadPool]: On create_dap1_sources_snapshots: /* {"app": "dbt", "dbt_version": "1.6.3", "dbt_databricks_version": "1.6.4", "databricks_sql_connector_version": "2.9.3", "profile_name": "databricks_dbt", "target_name": "dap", "connection_name": "create_dap1_sources_snapshots"} */
create schema if not exists `dap1_sources`.`snapshots`

12:18:32.813750 [debug] [ThreadPool]: Opening a new connection, currently in state closed
12:18:35.276748 [debug] [ThreadPool]: SQL status: OK in 2.4600000381469727 seconds
12:18:35.284501 [debug] [ThreadPool]: Spark adapter: NotImplemented: commit
12:18:35.285453 [debug] [ThreadPool]: On create_dap1_sources_snapshots: ROLLBACK
12:18:35.286068 [debug] [ThreadPool]: Databricks adapter: NotImplemented: rollback
12:18:35.286585 [debug] [ThreadPool]: On create_dap1_sources_snapshots: Close
12:18:35.657061 [debug] [ThreadPool]: Re-using an available connection from the pool (formerly create_dap1_sources_snapshots, now list_dap1_sidney_feygin_marts)
12:18:35.659363 [debug] [ThreadPool]: Acquiring new databricks connection 'list_dap1_sources_snapshots'
12:18:35.660997 [debug] [ThreadPool]: Acquiring new databricks connection 'list_dap1_sidney_feygin_seeds'
12:18:35.672034 [debug] [ThreadPool]: Acquiring new databricks connection 'list_dap1_sidney_feygin_common'
12:18:35.675565 [debug] [ThreadPool]: Using databricks connection "list_dap1_sidney_feygin_marts"
12:18:35.678071 [debug] [ThreadPool]: Acquiring new databricks connection 'list_dap1_sidney_feygin_staging'
12:18:35.680222 [debug] [ThreadPool]: Using databricks connection "list_dap1_sources_snapshots"
12:18:35.682503 [debug] [ThreadPool]: Using databricks connection "list_dap1_sidney_feygin_seeds"
12:18:35.684521 [debug] [ThreadPool]: Using databricks connection "list_dap1_sidney_feygin_common"
12:18:35.684864 [debug] [ThreadPool]: On list_dap1_sidney_feygin_marts: GetTables(database=dap1_sidney_feygin, schema=marts, identifier=None)
12:18:35.780298 [debug] [ThreadPool]: Using databricks connection "list_dap1_sidney_feygin_staging"
12:18:35.780668 [debug] [ThreadPool]: On list_dap1_sources_snapshots: GetTables(database=dap1_sources, schema=snapshots, identifier=None)
12:18:35.780951 [debug] [ThreadPool]: On list_dap1_sidney_feygin_seeds: GetTables(database=dap1_sidney_feygin, schema=seeds, identifier=None)
12:18:35.781168 [debug] [ThreadPool]: On list_dap1_sidney_feygin_common: GetTables(database=dap1_sidney_feygin, schema=common, identifier=None)
12:18:35.781438 [debug] [ThreadPool]: Opening a new connection, currently in state closed
12:18:35.781839 [debug] [ThreadPool]: On list_dap1_sidney_feygin_staging: GetTables(database=dap1_sidney_feygin, schema=staging, identifier=None)
12:18:35.782106 [debug] [ThreadPool]: Opening a new connection, currently in state init
12:18:35.782329 [debug] [ThreadPool]: Opening a new connection, currently in state init
12:18:35.782536 [debug] [ThreadPool]: Opening a new connection, currently in state init
12:18:35.782933 [debug] [ThreadPool]: Opening a new connection, currently in state init
12:18:37.618764 [debug] [ThreadPool]: SQL status: OK in 1.840000033378601 seconds
12:18:37.640659 [debug] [ThreadPool]: On list_dap1_sidney_feygin_seeds: Close
12:18:37.648057 [debug] [ThreadPool]: SQL status: OK in 1.8700000047683716 seconds
12:18:37.648770 [debug] [ThreadPool]: SQL status: OK in 1.8700000047683716 seconds
12:18:37.649365 [debug] [ThreadPool]: SQL status: OK in 1.8700000047683716 seconds
12:18:37.655291 [debug] [ThreadPool]: On list_dap1_sidney_feygin_marts: Close
12:18:37.656748 [debug] [ThreadPool]: On list_dap1_sidney_feygin_common: Close
12:18:37.665678 [debug] [ThreadPool]: Spark adapter: NotImplemented: add_begin_query
12:18:37.666624 [debug] [ThreadPool]: Using databricks connection "list_dap1_sidney_feygin_staging"
12:18:37.666979 [debug] [ThreadPool]: On list_dap1_sidney_feygin_staging: /* {"app": "dbt", "dbt_version": "1.6.3", "dbt_databricks_version": "1.6.4", "databricks_sql_connector_version": "2.9.3", "profile_name": "databricks_dbt", "target_name": "dap", "connection_name": "list_dap1_sidney_feygin_staging"} */

      select current_catalog()

12:18:37.717965 [debug] [ThreadPool]: SQL status: OK in 1.940000057220459 seconds
12:18:37.720472 [debug] [ThreadPool]: On list_dap1_sources_snapshots: Close
12:18:37.873923 [debug] [ThreadPool]: Re-using an available connection from the pool (formerly list_dap1_sidney_feygin_marts, now list_dap1_sidney_feygin_prep)
12:18:37.883466 [debug] [ThreadPool]: Using databricks connection "list_dap1_sidney_feygin_prep"
12:18:37.884506 [debug] [ThreadPool]: On list_dap1_sidney_feygin_prep: GetTables(database=dap1_sidney_feygin, schema=prep, identifier=None)
12:18:37.885940 [debug] [ThreadPool]: Opening a new connection, currently in state closed
12:18:39.726237 [debug] [ThreadPool]: SQL status: OK in 1.840000033378601 seconds
12:18:39.736542 [debug] [ThreadPool]: On list_dap1_sidney_feygin_prep: Close
12:18:41.406664 [debug] [ThreadPool]: SQL status: OK in 3.740000009536743 seconds
12:18:41.448651 [debug] [ThreadPool]: Using databricks connection "list_dap1_sidney_feygin_staging"
12:18:41.449073 [debug] [ThreadPool]: On list_dap1_sidney_feygin_staging: /* {"app": "dbt", "dbt_version": "1.6.3", "dbt_databricks_version": "1.6.4", "databricks_sql_connector_version": "2.9.3", "profile_name": "databricks_dbt", "target_name": "dap", "connection_name": "list_dap1_sidney_feygin_staging"} */
show views in `dap1_sidney_feygin`.`staging`

12:18:42.768256 [debug] [ThreadPool]: SQL status: OK in 1.3200000524520874 seconds
12:18:42.783290 [debug] [ThreadPool]: On list_dap1_sidney_feygin_staging: ROLLBACK
12:18:42.784401 [debug] [ThreadPool]: Databricks adapter: NotImplemented: rollback
12:18:42.785231 [debug] [ThreadPool]: On list_dap1_sidney_feygin_staging: Close
12:18:43.288320 [debug] [MainThread]: Sending event: {'category': 'dbt', 'action': 'runnable_timing', 'label': '1f3f02f7-8f79-4cb7-b515-d8f694742d97', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x15c2c76d0>]}
12:18:43.293474 [debug] [MainThread]: Spark adapter: NotImplemented: add_begin_query
12:18:43.298283 [debug] [MainThread]: Spark adapter: NotImplemented: commit
12:18:43.300918 [info ] [MainThread]: Concurrency: 5 threads (target='dap')
12:18:43.306112 [info ] [MainThread]: 
12:18:43.314787 [debug] [Thread-1 (]: Began running node snapshot.databricks_dbt.bdapi__assets_snapshot
12:18:43.315302 [debug] [Thread-2 (]: Began running node snapshot.databricks_dbt.bdapi__entities_snapshot
12:18:43.315871 [info ] [Thread-1 (]: 1 of 2 START snapshot dap1_sources.snapshots.bdapi__assets_snapshot ............ [RUN]
12:18:43.316317 [info ] [Thread-2 (]: 2 of 2 START snapshot dap1_sources.snapshots.bdapi__entities_snapshot .......... [RUN]
12:18:43.317024 [debug] [Thread-1 (]: Re-using an available connection from the pool (formerly list_dap1_sidney_feygin_prep, now snapshot.databricks_dbt.bdapi__assets_snapshot)
12:18:43.317624 [debug] [Thread-2 (]: Re-using an available connection from the pool (formerly list_dap1_sources_snapshots, now snapshot.databricks_dbt.bdapi__entities_snapshot)
12:18:43.317988 [debug] [Thread-1 (]: Began compiling node snapshot.databricks_dbt.bdapi__assets_snapshot
12:18:43.318351 [debug] [Thread-2 (]: Began compiling node snapshot.databricks_dbt.bdapi__entities_snapshot
12:18:43.331338 [debug] [Thread-1 (]: Timing info for snapshot.databricks_dbt.bdapi__assets_snapshot (compile): 12:18:43.318730 => 12:18:43.330966
12:18:43.336244 [debug] [Thread-2 (]: Timing info for snapshot.databricks_dbt.bdapi__entities_snapshot (compile): 12:18:43.331722 => 12:18:43.335996
12:18:43.336893 [debug] [Thread-1 (]: Began executing node snapshot.databricks_dbt.bdapi__assets_snapshot
12:18:43.337402 [debug] [Thread-2 (]: Began executing node snapshot.databricks_dbt.bdapi__entities_snapshot
12:18:43.373649 [debug] [Thread-1 (]: Spark adapter: NotImplemented: add_begin_query
12:18:43.374299 [debug] [Thread-2 (]: Spark adapter: NotImplemented: add_begin_query
12:18:43.374600 [debug] [Thread-1 (]: Using databricks connection "snapshot.databricks_dbt.bdapi__assets_snapshot"
12:18:43.374843 [debug] [Thread-2 (]: Using databricks connection "snapshot.databricks_dbt.bdapi__entities_snapshot"
12:18:43.375096 [debug] [Thread-1 (]: On snapshot.databricks_dbt.bdapi__assets_snapshot: /* {"app": "dbt", "dbt_version": "1.6.3", "dbt_databricks_version": "1.6.4", "databricks_sql_connector_version": "2.9.3", "profile_name": "databricks_dbt", "target_name": "dap", "node_id": "snapshot.databricks_dbt.bdapi__assets_snapshot"} */

      describe extended `dap1_sources`.`snapshots`.`bdapi__assets_snapshot`

12:18:43.375353 [debug] [Thread-2 (]: On snapshot.databricks_dbt.bdapi__entities_snapshot: /* {"app": "dbt", "dbt_version": "1.6.3", "dbt_databricks_version": "1.6.4", "databricks_sql_connector_version": "2.9.3", "profile_name": "databricks_dbt", "target_name": "dap", "node_id": "snapshot.databricks_dbt.bdapi__entities_snapshot"} */

      describe extended `dap1_sources`.`snapshots`.`bdapi__entities_snapshot`

12:18:43.375932 [debug] [Thread-1 (]: Opening a new connection, currently in state closed
12:18:43.376261 [debug] [Thread-2 (]: Opening a new connection, currently in state closed
12:18:49.693741 [debug] [Thread-1 (]: SQL status: OK in 6.320000171661377 seconds
12:18:49.759572 [debug] [Thread-1 (]: Using databricks connection "snapshot.databricks_dbt.bdapi__assets_snapshot"
12:18:49.760243 [debug] [Thread-1 (]: On snapshot.databricks_dbt.bdapi__assets_snapshot: /* {"app": "dbt", "dbt_version": "1.6.3", "dbt_databricks_version": "1.6.4", "databricks_sql_connector_version": "2.9.3", "profile_name": "databricks_dbt", "target_name": "dap", "node_id": "snapshot.databricks_dbt.bdapi__assets_snapshot"} */
select * from (
        select hardwareId, fleet, hub, organization, state, type, configuration, capabilities, timeZone, creationDate, lastUpdated, ready, faultCount, parent, host, diagnostics from (

select * from `dap1_sources`.bdapi.bdapi__assets

            ) subq
    ) as __dbt_sbq
    where false
    limit 0

12:18:49.768923 [debug] [Thread-2 (]: SQL status: OK in 6.389999866485596 seconds
12:18:49.775819 [debug] [Thread-2 (]: Using databricks connection "snapshot.databricks_dbt.bdapi__entities_snapshot"
12:18:49.776466 [debug] [Thread-2 (]: On snapshot.databricks_dbt.bdapi__entities_snapshot: /* {"app": "dbt", "dbt_version": "1.6.3", "dbt_databricks_version": "1.6.4", "databricks_sql_connector_version": "2.9.3", "profile_name": "databricks_dbt", "target_name": "dap", "node_id": "snapshot.databricks_dbt.bdapi__entities_snapshot"} */
select * from (
        select name, emailAddress, organization.id, organization.name, hub.id, hub.name, address, status, phoneNumber, primaryContact, organizationEntityConfiguration.lastUpdated from (

select *
from `dap1_sources`.bdapi.bdapi__entities

            ) subq
    ) as __dbt_sbq
    where false
    limit 0

12:18:51.650302 [debug] [Thread-2 (]: SQL status: OK in 1.8700000047683716 seconds
12:18:51.668167 [debug] [Thread-2 (]: Using databricks connection "snapshot.databricks_dbt.bdapi__entities_snapshot"
12:18:51.668761 [debug] [Thread-2 (]: On snapshot.databricks_dbt.bdapi__entities_snapshot: /* {"app": "dbt", "dbt_version": "1.6.3", "dbt_databricks_version": "1.6.4", "databricks_sql_connector_version": "2.9.3", "profile_name": "databricks_dbt", "target_name": "dap", "node_id": "snapshot.databricks_dbt.bdapi__entities_snapshot"} */

      describe extended `dap1_sources`.`snapshots`.`bdapi__entities_snapshot`

12:18:51.707780 [debug] [Thread-1 (]: SQL status: OK in 1.9500000476837158 seconds
12:18:51.712571 [debug] [Thread-1 (]: Using databricks connection "snapshot.databricks_dbt.bdapi__assets_snapshot"
12:18:51.712888 [debug] [Thread-1 (]: On snapshot.databricks_dbt.bdapi__assets_snapshot: /* {"app": "dbt", "dbt_version": "1.6.3", "dbt_databricks_version": "1.6.4", "databricks_sql_connector_version": "2.9.3", "profile_name": "databricks_dbt", "target_name": "dap", "node_id": "snapshot.databricks_dbt.bdapi__assets_snapshot"} */

      describe extended `dap1_sources`.`snapshots`.`bdapi__assets_snapshot`

12:18:52.203693 [debug] [Thread-2 (]: SQL status: OK in 0.5299999713897705 seconds
12:18:52.233000 [debug] [Thread-2 (]: Using databricks connection "snapshot.databricks_dbt.bdapi__entities_snapshot"
12:18:52.233451 [debug] [Thread-2 (]: On snapshot.databricks_dbt.bdapi__entities_snapshot: /* {"app": "dbt", "dbt_version": "1.6.3", "dbt_databricks_version": "1.6.4", "databricks_sql_connector_version": "2.9.3", "profile_name": "databricks_dbt", "target_name": "dap", "node_id": "snapshot.databricks_dbt.bdapi__entities_snapshot"} */

      describe extended `dap1_sources`.`snapshots`.`bdapi__entities_snapshot`

12:18:52.433439 [debug] [Thread-1 (]: SQL status: OK in 0.7200000286102295 seconds
12:18:52.441088 [debug] [Thread-1 (]: Using databricks connection "snapshot.databricks_dbt.bdapi__assets_snapshot"
12:18:52.441560 [debug] [Thread-1 (]: On snapshot.databricks_dbt.bdapi__assets_snapshot: /* {"app": "dbt", "dbt_version": "1.6.3", "dbt_databricks_version": "1.6.4", "databricks_sql_connector_version": "2.9.3", "profile_name": "databricks_dbt", "target_name": "dap", "node_id": "snapshot.databricks_dbt.bdapi__assets_snapshot"} */

      describe extended `dap1_sources`.`snapshots`.`bdapi__assets_snapshot`

12:18:52.966969 [debug] [Thread-2 (]: SQL status: OK in 0.7300000190734863 seconds
12:18:52.984796 [debug] [Thread-2 (]: Using databricks connection "snapshot.databricks_dbt.bdapi__entities_snapshot"
12:18:52.985293 [debug] [Thread-2 (]: On snapshot.databricks_dbt.bdapi__entities_snapshot: /* {"app": "dbt", "dbt_version": "1.6.3", "dbt_databricks_version": "1.6.4", "databricks_sql_connector_version": "2.9.3", "profile_name": "databricks_dbt", "target_name": "dap", "node_id": "snapshot.databricks_dbt.bdapi__entities_snapshot"} */

        create or replace view `dap1_sources`.`snapshots`.`bdapi__entities_snapshot__dbt_tmp`

  as
    with snapshot_query as (

select *
from `dap1_sources`.bdapi.bdapi__entities

    ),

    snapshotted_data as (

        select *,
            id as dbt_unique_key

        from `dap1_sources`.`snapshots`.`bdapi__entities_snapshot`
        where dbt_valid_to is null

    ),

    insertions_source_data as (

        select
            *,
            id as dbt_unique_key,

    current_timestamp()
 as dbt_updated_at,

    current_timestamp()
 as dbt_valid_from,
            nullif(
    current_timestamp()
, 
    current_timestamp()
) as dbt_valid_to,
            md5(coalesce(cast(id as string ), '')
         || '|' || coalesce(cast(
    current_timestamp()
 as string ), '')
        ) as dbt_scd_id

        from snapshot_query
    ),

    updates_source_data as (

        select
            *,
            id as dbt_unique_key,

    current_timestamp()
 as dbt_updated_at,

    current_timestamp()
 as dbt_valid_from,

    current_timestamp()
 as dbt_valid_to

        from snapshot_query
    ),

    insertions as (

        select
            'insert' as dbt_change_type,
            source_data.*

        from insertions_source_data as source_data
        left outer join snapshotted_data on snapshotted_data.dbt_unique_key = source_data.dbt_unique_key
        where snapshotted_data.dbt_unique_key is null
           or (
                snapshotted_data.dbt_unique_key is not null
            and (
                (
  TRUE
)
            )
        )

    ),

    updates as (

        select
            'update' as dbt_change_type,
            source_data.*,
            snapshotted_data.dbt_scd_id

        from updates_source_data as source_data
        join snapshotted_data on snapshotted_data.dbt_unique_key = source_data.dbt_unique_key
        where (
            (
  TRUE
)
        )
    )

    select * from insertions
    union all
    select * from updates

12:18:53.077621 [debug] [Thread-1 (]: SQL status: OK in 0.6399999856948853 seconds
12:18:53.091628 [debug] [Thread-1 (]: Using databricks connection "snapshot.databricks_dbt.bdapi__assets_snapshot"
12:18:53.097551 [debug] [Thread-1 (]: On snapshot.databricks_dbt.bdapi__assets_snapshot: /* {"app": "dbt", "dbt_version": "1.6.3", "dbt_databricks_version": "1.6.4", "databricks_sql_connector_version": "2.9.3", "profile_name": "databricks_dbt", "target_name": "dap", "node_id": "snapshot.databricks_dbt.bdapi__assets_snapshot"} */

        create or replace view `dap1_sources`.`snapshots`.`bdapi__assets_snapshot__dbt_tmp`

  as
    with snapshot_query as (

select * from `dap1_sources`.bdapi.bdapi__assets

    ),

    snapshotted_data as (

        select *,
            id as dbt_unique_key

        from `dap1_sources`.`snapshots`.`bdapi__assets_snapshot`
        where dbt_valid_to is null

    ),

    insertions_source_data as (

        select
            *,
            id as dbt_unique_key,

    current_timestamp()
 as dbt_updated_at,

    current_timestamp()
 as dbt_valid_from,
            nullif(
    current_timestamp()
, 
    current_timestamp()
) as dbt_valid_to,
            md5(coalesce(cast(id as string ), '')
         || '|' || coalesce(cast(
    current_timestamp()
 as string ), '')
        ) as dbt_scd_id

        from snapshot_query
    ),

    updates_source_data as (

        select
            *,
            id as dbt_unique_key,

    current_timestamp()
 as dbt_updated_at,

    current_timestamp()
 as dbt_valid_from,

    current_timestamp()
 as dbt_valid_to

        from snapshot_query
    ),

    insertions as (

        select
            'insert' as dbt_change_type,
            source_data.*

        from insertions_source_data as source_data
        left outer join snapshotted_data on snapshotted_data.dbt_unique_key = source_data.dbt_unique_key
        where snapshotted_data.dbt_unique_key is null
           or (
                snapshotted_data.dbt_unique_key is not null
            and (
                (snapshotted_data.`hardwareId` != source_data.`hardwareId`
        or
        (
            ((snapshotted_data.`hardwareId` is null) and not (source_data.`hardwareId` is null))
            or
            ((not snapshotted_data.`hardwareId` is null) and (source_data.`hardwareId` is null))
        ) or snapshotted_data.`fleet` != source_data.`fleet`
        or
        (
            ((snapshotted_data.`fleet` is null) and not (source_data.`fleet` is null))
            or
            ((not snapshotted_data.`fleet` is null) and (source_data.`fleet` is null))
        ) or snapshotted_data.`hub` != source_data.`hub`
        or
        (
            ((snapshotted_data.`hub` is null) and not (source_data.`hub` is null))
            or
            ((not snapshotted_data.`hub` is null) and (source_data.`hub` is null))
        ) or snapshotted_data.`organization` != source_data.`organization`
        or
        (
            ((snapshotted_data.`organization` is null) and not (source_data.`organization` is null))
            or
            ((not snapshotted_data.`organization` is null) and (source_data.`organization` is null))
        ) or snapshotted_data.`state` != source_data.`state`
        or
        (
            ((snapshotted_data.`state` is null) and not (source_data.`state` is null))
            or
            ((not snapshotted_data.`state` is null) and (source_data.`state` is null))
        ) or snapshotted_data.`type` != source_data.`type`
        or
        (
            ((snapshotted_data.`type` is null) and not (source_data.`type` is null))
            or
            ((not snapshotted_data.`type` is null) and (source_data.`type` is null))
        ) or snapshotted_data.`configuration` != source_data.`configuration`
        or
        (
            ((snapshotted_data.`configuration` is null) and not (source_data.`configuration` is null))
            or
            ((not snapshotted_data.`configuration` is null) and (source_data.`configuration` is null))
        ) or snapshotted_data.`capabilities` != source_data.`capabilities`
        or
        (
            ((snapshotted_data.`capabilities` is null) and not (source_data.`capabilities` is null))
            or
            ((not snapshotted_data.`capabilities` is null) and (source_data.`capabilities` is null))
        ) or snapshotted_data.`timeZone` != source_data.`timeZone`
        or
        (
            ((snapshotted_data.`timeZone` is null) and not (source_data.`timeZone` is null))
            or
            ((not snapshotted_data.`timeZone` is null) and (source_data.`timeZone` is null))
        ) or snapshotted_data.`creationDate` != source_data.`creationDate`
        or
        (
            ((snapshotted_data.`creationDate` is null) and not (source_data.`creationDate` is null))
            or
            ((not snapshotted_data.`creationDate` is null) and (source_data.`creationDate` is null))
        ) or snapshotted_data.`lastUpdated` != source_data.`lastUpdated`
        or
        (
            ((snapshotted_data.`lastUpdated` is null) and not (source_data.`lastUpdated` is null))
            or
            ((not snapshotted_data.`lastUpdated` is null) and (source_data.`lastUpdated` is null))
        ) or snapshotted_data.`ready` != source_data.`ready`
        or
        (
            ((snapshotted_data.`ready` is null) and not (source_data.`ready` is null))
            or
            ((not snapshotted_data.`ready` is null) and (source_data.`ready` is null))
        ) or snapshotted_data.`faultCount` != source_data.`faultCount`
        or
        (
            ((snapshotted_data.`faultCount` is null) and not (source_data.`faultCount` is null))
            or
            ((not snapshotted_data.`faultCount` is null) and (source_data.`faultCount` is null))
        ) or snapshotted_data.`parent` != source_data.`parent`
        or
        (
            ((snapshotted_data.`parent` is null) and not (source_data.`parent` is null))
            or
            ((not snapshotted_data.`parent` is null) and (source_data.`parent` is null))
        ) or snapshotted_data.`host` != source_data.`host`
        or
        (
            ((snapshotted_data.`host` is null) and not (source_data.`host` is null))
            or
            ((not snapshotted_data.`host` is null) and (source_data.`host` is null))
        ) or snapshotted_data.`diagnostics` != source_data.`diagnostics`
        or
        (
            ((snapshotted_data.`diagnostics` is null) and not (source_data.`diagnostics` is null))
            or
            ((not snapshotted_data.`diagnostics` is null) and (source_data.`diagnostics` is null))
        ))
            )
        )

    ),

    updates as (

        select
            'update' as dbt_change_type,
            source_data.*,
            snapshotted_data.dbt_scd_id

        from updates_source_data as source_data
        join snapshotted_data on snapshotted_data.dbt_unique_key = source_data.dbt_unique_key
        where (
            (snapshotted_data.`hardwareId` != source_data.`hardwareId`
        or
        (
            ((snapshotted_data.`hardwareId` is null) and not (source_data.`hardwareId` is null))
            or
            ((not snapshotted_data.`hardwareId` is null) and (source_data.`hardwareId` is null))
        ) or snapshotted_data.`fleet` != source_data.`fleet`
        or
        (
            ((snapshotted_data.`fleet` is null) and not (source_data.`fleet` is null))
            or
            ((not snapshotted_data.`fleet` is null) and (source_data.`fleet` is null))
        ) or snapshotted_data.`hub` != source_data.`hub`
        or
        (
            ((snapshotted_data.`hub` is null) and not (source_data.`hub` is null))
            or
            ((not snapshotted_data.`hub` is null) and (source_data.`hub` is null))
        ) or snapshotted_data.`organization` != source_data.`organization`
        or
        (
            ((snapshotted_data.`organization` is null) and not (source_data.`organization` is null))
            or
            ((not snapshotted_data.`organization` is null) and (source_data.`organization` is null))
        ) or snapshotted_data.`state` != source_data.`state`
        or
        (
            ((snapshotted_data.`state` is null) and not (source_data.`state` is null))
            or
            ((not snapshotted_data.`state` is null) and (source_data.`state` is null))
        ) or snapshotted_data.`type` != source_data.`type`
        or
        (
            ((snapshotted_data.`type` is null) and not (source_data.`type` is null))
            or
            ((not snapshotted_data.`type` is null) and (source_data.`type` is null))
        ) or snapshotted_data.`configuration` != source_data.`configuration`
        or
        (
            ((snapshotted_data.`configuration` is null) and not (source_data.`configuration` is null))
            or
            ((not snapshotted_data.`configuration` is null) and (source_data.`configuration` is null))
        ) or snapshotted_data.`capabilities` != source_data.`capabilities`
        or
        (
            ((snapshotted_data.`capabilities` is null) and not (source_data.`capabilities` is null))
            or
            ((not snapshotted_data.`capabilities` is null) and (source_data.`capabilities` is null))
        ) or snapshotted_data.`timeZone` != source_data.`timeZone`
        or
        (
            ((snapshotted_data.`timeZone` is null) and not (source_data.`timeZone` is null))
            or
            ((not snapshotted_data.`timeZone` is null) and (source_data.`timeZone` is null))
        ) or snapshotted_data.`creationDate` != source_data.`creationDate`
        or
        (
            ((snapshotted_data.`creationDate` is null) and not (source_data.`creationDate` is null))
            or
            ((not snapshotted_data.`creationDate` is null) and (source_data.`creationDate` is null))
        ) or snapshotted_data.`lastUpdated` != source_data.`lastUpdated`
        or
        (
            ((snapshotted_data.`lastUpdated` is null) and not (source_data.`lastUpdated` is null))
            or
            ((not snapshotted_data.`lastUpdated` is null) and (source_data.`lastUpdated` is null))
        ) or snapshotted_data.`ready` != source_data.`ready`
        or
        (
            ((snapshotted_data.`ready` is null) and not (source_data.`ready` is null))
            or
            ((not snapshotted_data.`ready` is null) and (source_data.`ready` is null))
        ) or snapshotted_data.`faultCount` != source_data.`faultCount`
        or
        (
            ((snapshotted_data.`faultCount` is null) and not (source_data.`faultCount` is null))
            or
            ((not snapshotted_data.`faultCount` is null) and (source_data.`faultCount` is null))
        ) or snapshotted_data.`parent` != source_data.`parent`
        or
        (
            ((snapshotted_data.`parent` is null) and not (source_data.`parent` is null))
            or
            ((not snapshotted_data.`parent` is null) and (source_data.`parent` is null))
        ) or snapshotted_data.`host` != source_data.`host`
        or
        (
            ((snapshotted_data.`host` is null) and not (source_data.`host` is null))
            or
            ((not snapshotted_data.`host` is null) and (source_data.`host` is null))
        ) or snapshotted_data.`diagnostics` != source_data.`diagnostics`
        or
        (
            ((snapshotted_data.`diagnostics` is null) and not (source_data.`diagnostics` is null))
            or
            ((not snapshotted_data.`diagnostics` is null) and (source_data.`diagnostics` is null))
        ))
        )
    )

    select * from insertions
    union all
    select * from updates
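
The generated WHERE clause above repeats the same null-aware inequality for every checked column. As a minimal Python sketch (not dbt's actual code, just the semantics the SQL encodes), the per-column test is:

```python
def column_changed(old, new):
    """Null-aware inequality used by the check strategy's predicate:
    a column counts as changed when exactly one side is null, or when
    both sides are non-null and unequal. (Plain SQL `!=` returns NULL
    when either side is NULL, hence the explicit null branches.)"""
    if old is None and new is None:
        return False
    if (old is None) != (new is None):
        return True
    return old != new


def row_changed(snapshotted, source, check_cols):
    # The generated WHERE clause ORs the per-column tests together.
    return any(column_changed(snapshotted[c], source[c]) for c in check_cols)
```

Two identical rows never satisfy this predicate, so if new snapshot rows keep appearing with no visible difference, the warehouse-level `!=` on the checked columns (several of which appear to be struct-typed here, e.g. `organization`) must be evaluating as changed.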

12:18:54.637032 [debug] [Thread-2 (]: SQL status: OK in 1.649999976158142 seconds
12:18:54.646587 [debug] [Thread-2 (]: Using databricks connection "snapshot.databricks_dbt.bdapi__entities_snapshot"
12:18:54.647030 [debug] [Thread-2 (]: On snapshot.databricks_dbt.bdapi__entities_snapshot: /* {"app": "dbt", "dbt_version": "1.6.3", "dbt_databricks_version": "1.6.4", "databricks_sql_connector_version": "2.9.3", "profile_name": "databricks_dbt", "target_name": "dap", "node_id": "snapshot.databricks_dbt.bdapi__entities_snapshot"} */

      describe extended `dap1_sources`.`snapshots`.`bdapi__entities_snapshot__dbt_tmp`

12:18:54.754801 [debug] [Thread-1 (]: SQL status: OK in 1.649999976158142 seconds
12:18:54.758165 [debug] [Thread-1 (]: Using databricks connection "snapshot.databricks_dbt.bdapi__assets_snapshot"
12:18:54.758531 [debug] [Thread-1 (]: On snapshot.databricks_dbt.bdapi__assets_snapshot: /* {"app": "dbt", "dbt_version": "1.6.3", "dbt_databricks_version": "1.6.4", "databricks_sql_connector_version": "2.9.3", "profile_name": "databricks_dbt", "target_name": "dap", "node_id": "snapshot.databricks_dbt.bdapi__assets_snapshot"} */

      describe extended `dap1_sources`.`snapshots`.`bdapi__assets_snapshot__dbt_tmp`

12:18:55.316917 [debug] [Thread-2 (]: SQL status: OK in 0.6700000166893005 seconds
12:18:55.323255 [debug] [Thread-2 (]: Using databricks connection "snapshot.databricks_dbt.bdapi__entities_snapshot"
12:18:55.323580 [debug] [Thread-2 (]: On snapshot.databricks_dbt.bdapi__entities_snapshot: /* {"app": "dbt", "dbt_version": "1.6.3", "dbt_databricks_version": "1.6.4", "databricks_sql_connector_version": "2.9.3", "profile_name": "databricks_dbt", "target_name": "dap", "node_id": "snapshot.databricks_dbt.bdapi__entities_snapshot"} */

      describe extended `dap1_sources`.`snapshots`.`bdapi__entities_snapshot`

12:18:55.430562 [debug] [Thread-1 (]: SQL status: OK in 0.6700000166893005 seconds
12:18:55.458176 [debug] [Thread-1 (]: Using databricks connection "snapshot.databricks_dbt.bdapi__assets_snapshot"
12:18:55.458870 [debug] [Thread-1 (]: On snapshot.databricks_dbt.bdapi__assets_snapshot: /* {"app": "dbt", "dbt_version": "1.6.3", "dbt_databricks_version": "1.6.4", "databricks_sql_connector_version": "2.9.3", "profile_name": "databricks_dbt", "target_name": "dap", "node_id": "snapshot.databricks_dbt.bdapi__assets_snapshot"} */

      describe extended `dap1_sources`.`snapshots`.`bdapi__assets_snapshot`

12:18:55.913260 [debug] [Thread-2 (]: SQL status: OK in 0.5899999737739563 seconds
12:18:55.924715 [debug] [Thread-2 (]: Using databricks connection "snapshot.databricks_dbt.bdapi__entities_snapshot"
12:18:55.925231 [debug] [Thread-2 (]: On snapshot.databricks_dbt.bdapi__entities_snapshot: /* {"app": "dbt", "dbt_version": "1.6.3", "dbt_databricks_version": "1.6.4", "databricks_sql_connector_version": "2.9.3", "profile_name": "databricks_dbt", "target_name": "dap", "node_id": "snapshot.databricks_dbt.bdapi__entities_snapshot"} */

      describe extended `dap1_sources`.`snapshots`.`bdapi__entities_snapshot__dbt_tmp`

12:18:56.038909 [debug] [Thread-1 (]: SQL status: OK in 0.5799999833106995 seconds
12:18:56.060234 [debug] [Thread-1 (]: Using databricks connection "snapshot.databricks_dbt.bdapi__assets_snapshot"
12:18:56.060710 [debug] [Thread-1 (]: On snapshot.databricks_dbt.bdapi__assets_snapshot: /* {"app": "dbt", "dbt_version": "1.6.3", "dbt_databricks_version": "1.6.4", "databricks_sql_connector_version": "2.9.3", "profile_name": "databricks_dbt", "target_name": "dap", "node_id": "snapshot.databricks_dbt.bdapi__assets_snapshot"} */

      describe extended `dap1_sources`.`snapshots`.`bdapi__assets_snapshot__dbt_tmp`

12:18:56.421787 [debug] [Thread-2 (]: SQL status: OK in 0.5 seconds
12:18:56.425715 [debug] [Thread-2 (]: Using databricks connection "snapshot.databricks_dbt.bdapi__entities_snapshot"
12:18:56.426083 [debug] [Thread-2 (]: On snapshot.databricks_dbt.bdapi__entities_snapshot: /* {"app": "dbt", "dbt_version": "1.6.3", "dbt_databricks_version": "1.6.4", "databricks_sql_connector_version": "2.9.3", "profile_name": "databricks_dbt", "target_name": "dap", "node_id": "snapshot.databricks_dbt.bdapi__entities_snapshot"} */

      describe extended `dap1_sources`.`snapshots`.`bdapi__entities_snapshot`

12:18:56.703739 [debug] [Thread-1 (]: SQL status: OK in 0.6399999856948853 seconds
12:18:56.722580 [debug] [Thread-1 (]: Using databricks connection "snapshot.databricks_dbt.bdapi__assets_snapshot"
12:18:56.722967 [debug] [Thread-1 (]: On snapshot.databricks_dbt.bdapi__assets_snapshot: /* {"app": "dbt", "dbt_version": "1.6.3", "dbt_databricks_version": "1.6.4", "databricks_sql_connector_version": "2.9.3", "profile_name": "databricks_dbt", "target_name": "dap", "node_id": "snapshot.databricks_dbt.bdapi__assets_snapshot"} */

      describe extended `dap1_sources`.`snapshots`.`bdapi__assets_snapshot`

12:18:57.157513 [debug] [Thread-2 (]: SQL status: OK in 0.7300000190734863 seconds
12:18:57.173038 [debug] [Thread-2 (]: Using databricks connection "snapshot.databricks_dbt.bdapi__entities_snapshot"
12:18:57.173898 [debug] [Thread-2 (]: On snapshot.databricks_dbt.bdapi__entities_snapshot: /* {"app": "dbt", "dbt_version": "1.6.3", "dbt_databricks_version": "1.6.4", "databricks_sql_connector_version": "2.9.3", "profile_name": "databricks_dbt", "target_name": "dap", "node_id": "snapshot.databricks_dbt.bdapi__entities_snapshot"} */

      describe extended `dap1_sources`.`snapshots`.`bdapi__entities_snapshot__dbt_tmp`

12:18:57.298888 [debug] [Thread-1 (]: SQL status: OK in 0.5799999833106995 seconds
12:18:57.324322 [debug] [Thread-1 (]: Using databricks connection "snapshot.databricks_dbt.bdapi__assets_snapshot"
12:18:57.325250 [debug] [Thread-1 (]: On snapshot.databricks_dbt.bdapi__assets_snapshot: /* {"app": "dbt", "dbt_version": "1.6.3", "dbt_databricks_version": "1.6.4", "databricks_sql_connector_version": "2.9.3", "profile_name": "databricks_dbt", "target_name": "dap", "node_id": "snapshot.databricks_dbt.bdapi__assets_snapshot"} */

      describe extended `dap1_sources`.`snapshots`.`bdapi__assets_snapshot__dbt_tmp`

12:18:57.747762 [debug] [Thread-2 (]: SQL status: OK in 0.5699999928474426 seconds
12:18:57.759028 [debug] [Thread-2 (]: Writing runtime SQL for node "snapshot.databricks_dbt.bdapi__entities_snapshot"
12:18:57.762250 [debug] [Thread-1 (]: SQL status: OK in 0.4399999976158142 seconds
12:18:57.762826 [debug] [Thread-2 (]: Using databricks connection "snapshot.databricks_dbt.bdapi__entities_snapshot"
12:18:57.763683 [debug] [Thread-2 (]: On snapshot.databricks_dbt.bdapi__entities_snapshot: /* {"app": "dbt", "dbt_version": "1.6.3", "dbt_databricks_version": "1.6.4", "databricks_sql_connector_version": "2.9.3", "profile_name": "databricks_dbt", "target_name": "dap", "node_id": "snapshot.databricks_dbt.bdapi__entities_snapshot"} */

          merge into `dap1_sources`.`snapshots`.`bdapi__entities_snapshot` as DBT_INTERNAL_DEST

      using `dap1_sources`.`snapshots`.`bdapi__entities_snapshot__dbt_tmp` as DBT_INTERNAL_SOURCE

    on DBT_INTERNAL_SOURCE.dbt_scd_id = DBT_INTERNAL_DEST.dbt_scd_id
    when matched
     and DBT_INTERNAL_DEST.dbt_valid_to is null
     and DBT_INTERNAL_SOURCE.dbt_change_type in ('update', 'delete')
        then update
        set dbt_valid_to = DBT_INTERNAL_SOURCE.dbt_valid_to

    when not matched
     and DBT_INTERNAL_SOURCE.dbt_change_type = 'insert'
        then insert *
    ;

12:18:57.766401 [debug] [Thread-1 (]: Writing runtime SQL for node "snapshot.databricks_dbt.bdapi__assets_snapshot"
12:18:57.768409 [debug] [Thread-1 (]: Using databricks connection "snapshot.databricks_dbt.bdapi__assets_snapshot"
12:18:57.768721 [debug] [Thread-1 (]: On snapshot.databricks_dbt.bdapi__assets_snapshot: /* {"app": "dbt", "dbt_version": "1.6.3", "dbt_databricks_version": "1.6.4", "databricks_sql_connector_version": "2.9.3", "profile_name": "databricks_dbt", "target_name": "dap", "node_id": "snapshot.databricks_dbt.bdapi__assets_snapshot"} */

          merge into `dap1_sources`.`snapshots`.`bdapi__assets_snapshot` as DBT_INTERNAL_DEST

      using `dap1_sources`.`snapshots`.`bdapi__assets_snapshot__dbt_tmp` as DBT_INTERNAL_SOURCE

    on DBT_INTERNAL_SOURCE.dbt_scd_id = DBT_INTERNAL_DEST.dbt_scd_id
    when matched
     and DBT_INTERNAL_DEST.dbt_valid_to is null
     and DBT_INTERNAL_SOURCE.dbt_change_type in ('update', 'delete')
        then update
        set dbt_valid_to = DBT_INTERNAL_SOURCE.dbt_valid_to

    when not matched
     and DBT_INTERNAL_SOURCE.dbt_change_type = 'insert'
        then insert *
    ;
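
For context, the merge statements above apply the staged rows roughly like this (a minimal in-memory Python sketch of the matched / not-matched branches, not dbt's implementation; the authoritative behavior is the SQL itself):

```python
def apply_snapshot_merge(dest, staged):
    """Sketch of the snapshot merge: dest and staged are lists of dicts
    keyed by dbt_scd_id. Updates/deletes close out the currently valid
    destination row; inserts are appended whenever their scd_id is new."""
    by_scd_id = {row["dbt_scd_id"]: row for row in dest}
    for src in staged:
        match = by_scd_id.get(src["dbt_scd_id"])
        if match is not None:
            # when matched and dbt_valid_to is null
            # and dbt_change_type in ('update', 'delete')
            if match["dbt_valid_to"] is None and src["dbt_change_type"] in ("update", "delete"):
                match["dbt_valid_to"] = src["dbt_valid_to"]
        elif src["dbt_change_type"] == "insert":
            # when not matched ... then insert *
            dest.append(dict(src))
    return dest
```

Because a freshly detected "change" is staged with a brand-new scd_id, any row the staging query flags as an insert always lands in the snapshot table; the merge itself cannot filter out false positives coming from the change detection upstream.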

12:19:29.989184 [debug] [Thread-1 (]: SQL status: OK in 32.209999084472656 seconds
12:19:30.117078 [debug] [Thread-2 (]: SQL status: OK in 32.349998474121094 seconds
12:19:30.438691 [debug] [Thread-1 (]: Using databricks connection "snapshot.databricks_dbt.bdapi__assets_snapshot"
12:19:30.439681 [debug] [Thread-1 (]: On snapshot.databricks_dbt.bdapi__assets_snapshot: /* {"app": "dbt", "dbt_version": "1.6.3", "dbt_databricks_version": "1.6.4", "databricks_sql_connector_version": "2.9.3", "profile_name": "databricks_dbt", "target_name": "dap", "node_id": "snapshot.databricks_dbt.bdapi__assets_snapshot"} */
drop view if exists `dap1_sources`.`snapshots`.`bdapi__assets_snapshot__dbt_tmp`
12:19:30.556888 [debug] [Thread-2 (]: Using databricks connection "snapshot.databricks_dbt.bdapi__entities_snapshot"
12:19:30.559842 [debug] [Thread-2 (]: On snapshot.databricks_dbt.bdapi__entities_snapshot: /* {"app": "dbt", "dbt_version": "1.6.3", "dbt_databricks_version": "1.6.4", "databricks_sql_connector_version": "2.9.3", "profile_name": "databricks_dbt", "target_name": "dap", "node_id": "snapshot.databricks_dbt.bdapi__entities_snapshot"} */
drop view if exists `dap1_sources`.`snapshots`.`bdapi__entities_snapshot__dbt_tmp`
12:19:31.937980 [debug] [Thread-1 (]: SQL status: OK in 1.5 seconds
12:19:31.991652 [debug] [Thread-1 (]: Spark adapter: NotImplemented: commit
12:19:31.996033 [debug] [Thread-1 (]: Timing info for snapshot.databricks_dbt.bdapi__assets_snapshot (execute): 12:18:43.337708 => 12:19:31.995656
12:19:31.996402 [debug] [Thread-2 (]: SQL status: OK in 1.4299999475479126 seconds
12:19:31.997114 [debug] [Thread-1 (]: On snapshot.databricks_dbt.bdapi__assets_snapshot: ROLLBACK
12:19:31.999238 [debug] [Thread-2 (]: Spark adapter: NotImplemented: commit
12:19:31.999764 [debug] [Thread-1 (]: Databricks adapter: NotImplemented: rollback
12:19:32.000874 [debug] [Thread-2 (]: Timing info for snapshot.databricks_dbt.bdapi__entities_snapshot (execute): 12:18:43.363873 => 12:19:32.000719
12:19:32.001293 [debug] [Thread-1 (]: On snapshot.databricks_dbt.bdapi__assets_snapshot: Close
12:19:32.001708 [debug] [Thread-2 (]: On snapshot.databricks_dbt.bdapi__entities_snapshot: ROLLBACK
12:19:32.002554 [debug] [Thread-2 (]: Databricks adapter: NotImplemented: rollback
12:19:32.003077 [debug] [Thread-2 (]: On snapshot.databricks_dbt.bdapi__entities_snapshot: Close
12:19:32.318657 [debug] [Thread-1 (]: Sending event: {'category': 'dbt', 'action': 'run_model', 'label': '1f3f02f7-8f79-4cb7-b515-d8f694742d97', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x137df86a0>]}
12:19:32.320622 [debug] [Thread-2 (]: Sending event: {'category': 'dbt', 'action': 'run_model', 'label': '1f3f02f7-8f79-4cb7-b515-d8f694742d97', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x1053b7e80>]}
12:19:32.323058 [info ] [Thread-1 (]: 1 of 2 OK snapshotted dap1_sources.snapshots.bdapi__assets_snapshot ............ [OK in 49.00s]
12:19:32.324593 [info ] [Thread-2 (]: 2 of 2 OK snapshotted dap1_sources.snapshots.bdapi__entities_snapshot .......... [OK in 49.00s]
12:19:32.325707 [debug] [Thread-1 (]: Finished running node snapshot.databricks_dbt.bdapi__assets_snapshot
12:19:32.326747 [debug] [Thread-2 (]: Finished running node snapshot.databricks_dbt.bdapi__entities_snapshot
12:19:32.344040 [debug] [MainThread]: On master: ROLLBACK
12:19:32.345074 [debug] [MainThread]: Opening a new connection, currently in state init
12:19:33.306241 [debug] [MainThread]: Databricks adapter: NotImplemented: rollback
12:19:33.307109 [debug] [MainThread]: Spark adapter: NotImplemented: add_begin_query
12:19:33.307501 [debug] [MainThread]: Spark adapter: NotImplemented: commit
12:19:33.307881 [debug] [MainThread]: On master: ROLLBACK
12:19:33.308236 [debug] [MainThread]: Databricks adapter: NotImplemented: rollback
12:19:33.308573 [debug] [MainThread]: On master: Close
12:19:33.657713 [debug] [MainThread]: Connection 'master' was properly closed.
12:19:33.658685 [debug] [MainThread]: Connection 'snapshot.databricks_dbt.bdapi__assets_snapshot' was properly closed.
12:19:33.659264 [debug] [MainThread]: Connection 'snapshot.databricks_dbt.bdapi__entities_snapshot' was properly closed.
12:19:33.659968 [debug] [MainThread]: Connection 'list_dap1_sidney_feygin_seeds' was properly closed.
12:19:33.660854 [debug] [MainThread]: Connection 'list_dap1_sidney_feygin_common' was properly closed.
12:19:33.661408 [debug] [MainThread]: Connection 'list_dap1_sidney_feygin_staging' was properly closed.
12:19:33.665078 [info ] [MainThread]: 
12:19:33.666077 [info ] [MainThread]: Finished running 2 snapshots in 0 hours 1 minutes and 2.80 seconds (62.80s).
12:19:33.668864 [debug] [MainThread]: Command end result
12:19:33.725155 [info ] [MainThread]: 
12:19:33.725712 [info ] [MainThread]: Completed successfully
12:19:33.726195 [info ] [MainThread]: 
12:19:33.726537 [info ] [MainThread]: Done. PASS=2 WARN=0 ERROR=0 SKIP=0 TOTAL=2
12:19:33.727555 [debug] [MainThread]: Command `dbt snapshot` succeeded at 12:19:33.727457 after 67.63 seconds
12:19:33.728250 [debug] [MainThread]: Sending event: {'category': 'dbt', 'action': 'invocation', 'label': 'end', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x106b06b90>, <snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x137947970>, <snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x110bbebc0>]}
12:19:33.728697 [debug] [MainThread]: Flushing usage events

Environment

- OS: Ubuntu 20.04
- Python: 3.10
- dbt-core: 1.6.3
- dbt-databricks: 1.6.4

Which database adapter are you using with dbt?

spark

Additional Context

I asked in the dbt Slack, and another user reported experiencing the same issue. I didn't receive a reply from the dbt devs, though.
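
To rule out invisible differences between the duplicated snapshot rows (e.g. trailing whitespace, or struct values that serialize with different key order), rows can be compared field by field with a script along these lines (column names here are illustrative; the real table has more fields):

```python
import json

def diff_rows(row_a, row_b,
              ignore=("dbt_scd_id", "dbt_updated_at", "dbt_valid_from", "dbt_valid_to")):
    """Return the columns whose serialized values differ between two
    snapshot rows, ignoring dbt's metadata columns."""
    diffs = {}
    for col in row_a:
        if col in ignore:
            continue
        # json.dumps with sort_keys normalizes dict/struct key order,
        # so only genuine value differences are reported
        a = json.dumps(row_a[col], sort_keys=True, default=str)
        b = json.dumps(row_b.get(col), sort_keys=True, default=str)
        if a != b:
            diffs[col] = (row_a[col], row_b.get(col))
    return diffs
```

If this returns an empty dict for the duplicated rows, the new versions really are identical on all checked columns, which is consistent with the screenshots above.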

dbeatty10 commented 1 year ago

Thanks for reaching out @sfwatergit !

Does this only happen to you for tables with columns that have quoted identifiers (like `faultCount`)?

i.e., does it work if you snapshot a table that only contains "plain" / unquoted column names (like `fault_count`)?

github-actions[bot] commented 10 months ago

This issue has been marked as Stale because it has been open for 180 days with no activity. If you would like the issue to remain open, please comment on the issue or else it will be closed in 7 days.

github-actions[bot] commented 10 months ago

Although we are closing this issue as stale, it's not gone forever. Issues can be reopened if there is renewed community interest. Just add a comment to notify the maintainers.
