Airbyte with dbt error for Snowflake #11646

Closed: octavia-squidington-iii closed this issue 2 years ago

octavia-squidington-iii commented 2 years ago

The following error occurs when running dbt through the Airbyte GUI, but when the same 'dbt run' is performed manually from the command line it is successful.

2022-03-24 17:46:24 INFO i.a.w.w.WorkerRun(call):49 - Executing worker wrapper. Airbyte version: 0.35.45-alpha
2022-03-24 17:46:26 INFO i.a.w.t.TemporalAttemptExecution(get):105 - Docker volume job log path: /tmp/workspace/82/1/logs.log
2022-03-24 17:46:26 INFO i.a.w.t.TemporalAttemptExecution(get):110 - Executing worker wrapper. Airbyte version: 0.35.45-alpha
2022-03-24 17:46:27 INFO i.a.w.DefaultReplicationWorker(run):103 - start sync worker. job id: 82 attempt id: 1
2022-03-24 17:46:27 INFO i.a.w.DefaultReplicationWorker(run):115 - configured sync modes: {null.covid_epidemology=full_refresh - append}
2022-03-24 17:46:27 INFO i.a.w.p.a.DefaultAirbyteDestination(start):69 - Running destination...
2022-03-24 17:46:27 INFO i.a.c.i.LineGobbler(voidCall):82 - Checking if airbyte/destination-snowflake:0.4.17 exists...
2022-03-24 17:46:27 INFO i.a.c.i.LineGobbler(voidCall):82 - airbyte/destination-snowflake:0.4.17 was found locally.
2022-03-24 17:46:27 INFO i.a.w.p.DockerProcessFactory(create):104 - Creating docker job ID: 82
2022-03-24 17:46:27 INFO i.a.w.p.DockerProcessFactory(create):155 - Preparing command: docker run --rm --init -i -w /data/82/1 --log-driver none --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local airbyte/destination-snowflake:0.4.17 write --config destination_config.json --catalog destination_catalog.json
2022-03-24 17:46:27 INFO i.a.c.i.LineGobbler(voidCall):82 - Checking if airbyte/source-file:0.2.9 exists...
2022-03-24 17:46:27 INFO i.a.c.i.LineGobbler(voidCall):82 - airbyte/source-file:0.2.9 was found locally.
2022-03-24 17:46:27 INFO i.a.w.p.DockerProcessFactory(create):104 - Creating docker job ID: 82
2022-03-24 17:46:27 INFO i.a.w.p.DockerProcessFactory(create):155 - Preparing command: docker run --rm --init -i -w /data/82/1 --log-driver none --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local airbyte/source-file:0.2.9 read --config source_config.json --catalog source_catalog.json
2022-03-24 17:46:27 INFO i.a.w.DefaultReplicationWorker(run):157 - Waiting for source and destination threads to complete.
2022-03-24 17:46:27 INFO i.a.w.DefaultReplicationWorker(lambda$getDestinationOutputRunnable$6):338 - Destination output thread started.
2022-03-24 17:46:27 INFO i.a.w.DefaultReplicationWorker(lambda$getReplicationRunnable$5):278 - Replication thread started.
2022-03-24 17:46:28 destination > SLF4J: Class path contains multiple SLF4J bindings.
2022-03-24 17:46:28 destination > SLF4J: Found binding in [jar:file:/airbyte/lib/log4j-slf4j-impl-2.17.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
2022-03-24 17:46:28 destination > SLF4J: Found binding in [jar:file:/airbyte/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
2022-03-24 17:46:28 destination > SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
2022-03-24 17:46:29 destination > SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
2022-03-24 17:46:30 source > Reading covid_epidemology (https://storage.googleapis.com/covid19-open-data/v2/latest/epidemiology.csv)...
2022-03-24 17:46:31 INFO i.a.w.DefaultReplicationWorker(lambda$getReplicationRunnable$5):300 - Records read: 1000
2022-03-24 17:46:31 INFO i.a.w.DefaultReplicationWorker(lambda$getReplicationRunnable$5):300 - Records read: 2000
2022-03-24 17:46:31 INFO i.a.w.DefaultReplicationWorker(lambda$getReplicationRunnable$5):300 - Records read: 3000
2022-03-24 17:46:32 destination > 2022-03-24 17:46:32 INFO i.a.i.b.IntegrationCliParser(parseOptions):118 - integration args: {catalog=destination_catalog.json, write=null, config=destination_config.json}
2022-03-24 17:46:32 destination > 2022-03-24 17:46:32 INFO i.a.i.b.IntegrationRunner(run):88 - Sentry transaction event: 966eeef97c534f82ac6742630f3ac90c
2022-03-24 17:46:32 destination > 2022-03-24 17:46:32 INFO i.a.i.b.IntegrationRunner(runInternal):106 - Running integration: io.airbyte.integrations.destination.snowflake.SnowflakeDestination
2022-03-24 17:46:32 destination > 2022-03-24 17:46:32 INFO i.a.i.b.IntegrationRunner(runInternal):107 - Command: WRITE
2022-03-24 17:46:32 destination > 2022-03-24 17:46:32 INFO i.a.i.b.IntegrationRunner(runInternal):108 - Integration config: IntegrationConfig{command=WRITE, configPath='destination_config.json', catalogPath='destination_catalog.json', statePath='null'}
2022-03-24 17:46:32 destination > 2022-03-24 17:46:32 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword examples - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-03-24 17:46:32 destination > 2022-03-24 17:46:32 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword order - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-03-24 17:46:32 destination > 2022-03-24 17:46:32 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword airbyte_secret - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-03-24 17:46:32 destination > 2022-03-24 17:46:32 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword multiline - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-03-24 17:46:32 destination > 2022-03-24 17:46:32 INFO i.a.i.d.j.c.SwitchingDestination(getConsumer):65 - Using destination type: INTERNAL_STAGING
2022-03-24 17:46:32 destination > 2022-03-24 17:46:32 INFO i.a.i.d.s.SnowflakeInternalStagingConsumerFactory(lambda$toWriteConfig$0):92 - Write config: WriteConfig{streamName=covid_epidemology, namespace=null, outputSchemaName=AIRBYTE_SCHEMA, tmpTableName=_airbyte_tmp_ixg_covid_epidemology, outputTableName=_airbyte_raw_covid_epidemology, syncMode=append}
2022-03-24 17:46:32 destination > 2022-03-24 17:46:32 INFO i.a.i.d.b.BufferedStreamConsumer(startTracked):141 - class io.airbyte.integrations.destination.buffered_stream_consumer.BufferedStreamConsumer started.
2022-03-24 17:46:32 destination > 2022-03-24 17:46:32 INFO i.a.i.d.s.SnowflakeInternalStagingConsumerFactory(lambda$onStartFunction$2):111 - Preparing tmp tables in destination started for 1 streams
2022-03-24 17:46:32 destination > 2022-03-24 17:46:32 INFO i.a.i.d.s.SnowflakeInternalStagingConsumerFactory(lambda$onStartFunction$2):119 - Preparing stage in destination started for schema AIRBYTE_SCHEMA stream covid_epidemology: tmp table: _airbyte_tmp_ixg_covid_epidemology, stage: AIRBYTE_SCHEMA_AIRBYTE_RAW_COVID_EPIDEMOLOGY
2022-03-24 17:46:37 destination > 2022-03-24 17:46:37 INFO i.a.d.j.DefaultJdbcDatabase(lambda$query$1):106 - closing connection
2022-03-24 17:46:37 destination > 2022-03-24 17:46:37 INFO i.a.i.d.s.SnowflakeInternalStagingConsumerFactory(lambda$onStartFunction$2):130 - Preparing stage in destination completed for schema AIRBYTE_SCHEMA stream covid_epidemology
2022-03-24 17:46:37 destination > 2022-03-24 17:46:37 INFO i.a.i.d.s.SnowflakeInternalStagingConsumerFactory(lambda$onStartFunction$2):133 - Preparing tables in destination completed.
2022-03-24 17:46:38 INFO i.a.w.DefaultReplicationWorker(lambda$getReplicationRunnable$5):300 - Records read: 4000
2022-03-24 17:46:38 INFO i.a.w.DefaultReplicationWorker(lambda$getReplicationRunnable$5):300 - Records read: 5000
2022-03-24 17:46:38 INFO i.a.w.DefaultReplicationWorker(lambda$getReplicationRunnable$5):300 - Records read: 6000
2022-03-24 17:46:38 INFO i.a.w.DefaultReplicationWorker(lambda$getReplicationRunnable$5):300 - Records read: 7000
2022-03-24 17:46:38 INFO i.a.w.DefaultReplicationWorker(lambda$getReplicationRunnable$5):300 - Records read: 8000
2022-03-24 17:46:38 INFO i.a.w.DefaultReplicationWorker(lambda$getReplicationRunnable$5):300 - Records read: 9000
2022-03-24 17:46:39 INFO i.a.w.DefaultReplicationWorker(lambda$getReplicationRunnable$5):300 - Records read: 10000
2022-03-24 17:46:39 INFO i.a.w.DefaultReplicationWorker(lambda$getReplicationRunnable$5):300 - Records read: 11000
2022-03-24 17:46:39 INFO i.a.w.DefaultReplicationWorker(lambda$getReplicationRunnable$5):300 - Records read: 12000
2022-03-24 17:46:39 INFO i.a.w.DefaultReplicationWorker(lambda$getReplicationRunnable$5):300 - Records read: 13000
2022-03-24 17:46:39 INFO i.a.w.DefaultReplicationWorker(lambda$getReplicationRunnable$5):300 - Records read: 14000
2022-03-24 17:46:39 INFO i.a.w.DefaultReplicationWorker(lambda$getReplicationRunnable$5):300 - Records read: 15000
2022-03-24 17:46:39 INFO i.a.w.DefaultReplicationWorker(lambda$getReplicationRunnable$5):300 - Records read: 16000
2022-03-24 17:46:40 INFO i.a.w.DefaultReplicationWorker(lambda$getReplicationRunnable$5):300 - Records read: 17000
2022-03-24 17:46:40 INFO i.a.w.DefaultReplicationWorker(lambda$getReplicationRunnable$5):304 - Total records read: 17911
2022-03-24 17:46:40 INFO i.a.w.DefaultReplicationWorker(run):162 - One of source or destination thread complete. Waiting on the other.
2022-03-24 17:46:40 destination > 2022-03-24 17:46:40 INFO i.a.i.b.FailureTrackingAirbyteMessageConsumer(close):65 - Airbyte message consumer: succeeded.
2022-03-24 17:46:40 destination > 2022-03-24 17:46:40 INFO i.a.i.d.b.BufferedStreamConsumer(close):217 - executing on success close procedure.
2022-03-24 17:46:40 destination > 2022-03-24 17:46:40 INFO i.a.i.d.b.BufferedStreamConsumer(flushQueueToDestination):181 - Flushing buffer: 14792692 bytes
2022-03-24 17:46:40 destination > 2022-03-24 17:46:40 INFO i.a.i.d.b.BufferedStreamConsumer(lambda$flushQueueToDestination$1):185 - Flushing covid_epidemology: 17911 records
2022-03-24 17:46:40 destination > 2022-03-24 17:46:40 INFO i.a.i.d.s.SnowflakeStagingSqlOperations(insertRecordsInternal):29 - Writing 17911 records to AIRBYTE_SCHEMA_AIRBYTE_RAW_COVID_EPIDEMOLOGY/STAGED/C26BF337-56AC-4178-81E5-AF7B1D230B09
2022-03-24 17:46:43 destination > 2022-03-24 17:46:43 INFO i.a.i.d.s.SnowflakeInternalStagingConsumerFactory(lambda$onCloseFunction$4):173 - Finalizing tables in destination started for 1 streams
2022-03-24 17:46:43 destination > 2022-03-24 17:46:43 INFO i.a.i.d.s.SnowflakeInternalStagingConsumerFactory(lambda$onCloseFunction$4):181 - Finalizing stream covid_epidemology. schema AIRBYTE_SCHEMA, tmp table _airbyte_tmp_ixg_covid_epidemology, final table _airbyte_raw_covid_epidemology, stage path AIRBYTE_SCHEMA_AIRBYTE_RAW_COVID_EPIDEMOLOGY/STAGED/C26BF337-56AC-4178-81E5-AF7B1D230B09
2022-03-24 17:46:44 destination > 2022-03-24 17:46:44 INFO i.a.i.d.s.SnowflakeInternalStagingConsumerFactory(lambda$onCloseFunction$4):201 - Executing finalization of tables.
2022-03-24 17:46:46 destination > 2022-03-24 17:46:46 INFO i.a.i.d.s.SnowflakeInternalStagingConsumerFactory(lambda$onCloseFunction$4):203 - Finalizing tables in destination completed.
2022-03-24 17:46:46 destination > 2022-03-24 17:46:46 INFO i.a.i.d.s.SnowflakeInternalStagingConsumerFactory(lambda$onCloseFunction$4):205 - Cleaning tmp tables in destination started for 1 streams
2022-03-24 17:46:46 destination > 2022-03-24 17:46:46 INFO i.a.i.d.s.SnowflakeInternalStagingConsumerFactory(lambda$onCloseFunction$4):209 - Cleaning tmp table in destination started for stream covid_epidemology. schema AIRBYTE_SCHEMA, tmp table name: _airbyte_tmp_ixg_covid_epidemology
2022-03-24 17:46:46 destination > 2022-03-24 17:46:46 INFO i.a.i.d.s.SnowflakeInternalStagingConsumerFactory(lambda$onCloseFunction$4):215 - Cleaning stage in destination started for stream covid_epidemology. schema AIRBYTE_SCHEMA, stage: AIRBYTE_SCHEMA_AIRBYTE_RAW_COVID_EPIDEMOLOGY
2022-03-24 17:46:46 destination > 2022-03-24 17:46:46 INFO i.a.i.d.s.SnowflakeInternalStagingConsumerFactory(lambda$onCloseFunction$4):219 - Cleaning tmp tables and stages in destination completed.
2022-03-24 17:46:46 destination > 2022-03-24 17:46:46 INFO i.a.i.b.IntegrationRunner(runInternal):154 - Completed integration: io.airbyte.integrations.destination.snowflake.SnowflakeDestination
2022-03-24 17:46:47 INFO i.a.w.DefaultReplicationWorker(run):164 - Source and destination threads complete.
2022-03-24 17:46:47 INFO i.a.w.DefaultReplicationWorker(run):227 - sync summary: io.airbyte.config.ReplicationAttemptSummary@10f3cdbf[status=completed,recordsSynced=17911,bytesSynced=3700921,startTime=1648143987025,endTime=1648144007742,totalStats=io.airbyte.config.SyncStats@44c03eb5[recordsEmitted=17911,bytesEmitted=3700921,stateMessagesEmitted=0,recordsCommitted=17911],streamStats=[io.airbyte.config.StreamSyncStats@6c815ac[streamName=covid_epidemology,stats=io.airbyte.config.SyncStats@1e208852[recordsEmitted=17911,bytesEmitted=3700921,stateMessagesEmitted=<null>,recordsCommitted=17911]]]]
2022-03-24 17:46:47 INFO i.a.w.DefaultReplicationWorker(run):249 - Source did not output any state messages
2022-03-24 17:46:47 WARN i.a.w.DefaultReplicationWorker(run):260 - State capture: No state retained.
2022-03-24 17:46:47 INFO i.a.w.t.TemporalAttemptExecution(get):131 - Stopping cancellation check scheduling...
2022-03-24 17:46:47 INFO i.a.w.t.s.ReplicationActivityImpl(lambda$replicate$1):147 - sync summary: io.airbyte.config.StandardSyncOutput@530d8292[standardSyncSummary=io.airbyte.config.StandardSyncSummary@61d38f0b[status=completed,recordsSynced=17911,bytesSynced=3700921,startTime=1648143987025,endTime=1648144007742,totalStats=io.airbyte.config.SyncStats@44c03eb5[recordsEmitted=17911,bytesEmitted=3700921,stateMessagesEmitted=0,recordsCommitted=17911],streamStats=[io.airbyte.config.StreamSyncStats@6c815ac[streamName=covid_epidemology,stats=io.airbyte.config.SyncStats@1e208852[recordsEmitted=17911,bytesEmitted=3700921,stateMessagesEmitted=<null>,recordsCommitted=17911]]]],state=<null>,outputCatalog=io.airbyte.protocol.models.ConfiguredAirbyteCatalog@2b7d2a19[streams=[io.airbyte.protocol.models.ConfiguredAirbyteStream@30668667[stream=io.airbyte.protocol.models.AirbyteStream@f408beb[name=covid_epidemology,jsonSchema={"type":"object","$schema":"http://json-schema.org/draft-07/schema#","properties":{"key":{"type":["string","null"]},"date":{"type":["string","null"]},"new_tested":{"type":["number","null"]},"new_deceased":{"type":["number","null"]},"total_tested":{"type":["number","null"]},"new_confirmed":{"type":["number","null"]},"new_recovered":{"type":["number","null"]},"total_deceased":{"type":["number","null"]},"total_confirmed":{"type":["number","null"]},"total_recovered":{"type":["number","null"]}}},supportedSyncModes=[full_refresh],sourceDefinedCursor=<null>,defaultCursorField=[],sourceDefinedPrimaryKey=[],namespace=<null>,additionalProperties={}],syncMode=full_refresh,cursorField=[],destinationSyncMode=append,primaryKey=[],additionalProperties={}]],additionalProperties={}],failures=[]]
2022-03-24 17:46:47 INFO i.a.w.t.TemporalUtils(withBackgroundHeartbeat):235 - Stopping temporal heartbeating...
2022-03-24 17:46:47 INFO i.a.w.t.TemporalAttemptExecution(get):105 - Docker volume job log path: /tmp/workspace/82/1/logs.log
2022-03-24 17:46:47 INFO i.a.w.t.TemporalAttemptExecution(get):110 - Executing worker wrapper. Airbyte version: 0.35.45-alpha
2022-03-24 17:46:47 INFO i.a.w.DefaultNormalizationWorker(run):46 - Running normalization.
2022-03-24 17:46:47 INFO i.a.w.n.DefaultNormalizationRunner(runProcess):122 - Running with normalization version: airbyte/normalization-snowflake:0.1.68
2022-03-24 17:46:47 INFO i.a.c.i.LineGobbler(voidCall):82 - Checking if airbyte/normalization-snowflake:0.1.68 exists...
2022-03-24 17:46:47 INFO i.a.c.i.LineGobbler(voidCall):82 - airbyte/normalization-snowflake:0.1.68 was found locally.
2022-03-24 17:46:47 INFO i.a.w.p.DockerProcessFactory(create):104 - Creating docker job ID: 82
2022-03-24 17:46:47 INFO i.a.w.p.DockerProcessFactory(create):155 - Preparing command: docker run --rm --init -i -w /data/82/1/normalize --log-driver none --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local airbyte/normalization-snowflake:0.1.68 run --integration-type snowflake --config destination_config.json --catalog destination_catalog.json
2022-03-24 17:46:48 normalization > Running: transform-config --config destination_config.json --integration-type snowflake --out /data/82/1/normalize
2022-03-24 17:46:48 normalization > Namespace(config='destination_config.json', integration_type=<DestinationType.snowflake: 'snowflake'>, out='/data/82/1/normalize')
2022-03-24 17:46:48 normalization > transform_snowflake
2022-03-24 17:46:48 normalization > Running: transform-catalog --integration-type snowflake --profile-config-dir /data/82/1/normalize --catalog destination_catalog.json --out /data/82/1/normalize/models/generated/ --json-column _airbyte_data
2022-03-24 17:46:48 normalization > Processing destination_catalog.json...
2022-03-24 17:46:48 normalization > Generating airbyte_ctes/AIRBYTE_SCHEMA/COVID_EPIDEMOLOGY_AB1.sql from covid_epidemology
2022-03-24 17:46:48 normalization > Generating airbyte_ctes/AIRBYTE_SCHEMA/COVID_EPIDEMOLOGY_AB2.sql from covid_epidemology
2022-03-24 17:46:48 normalization > Generating airbyte_ctes/AIRBYTE_SCHEMA/COVID_EPIDEMOLOGY_AB3.sql from covid_epidemology
2022-03-24 17:46:48 normalization > Generating airbyte_incremental/AIRBYTE_SCHEMA/COVID_EPIDEMOLOGY.sql from covid_epidemology
2022-03-24 17:46:48 normalization > detected no config file for ssh, assuming ssh is off.
2022-03-24 17:46:51 normalization > Running with dbt=0.21.1
2022-03-24 17:46:51 normalization > Unable to do partial parsing because ../build/partial_parse.msgpack not found
2022-03-24 17:46:53 normalization > [WARNING]: Configuration paths exist in your dbt_project.yml file which do not apply to any resources.
2022-03-24 17:46:53 normalization > There are 2 unused configuration paths:
2022-03-24 17:46:53 normalization > - models.airbyte_utils.generated.airbyte_tables
2022-03-24 17:46:53 normalization > - models.airbyte_utils.generated.airbyte_views
2022-03-24 17:46:53 normalization >
2022-03-24 17:46:53 normalization > Found 4 models, 0 tests, 0 snapshots, 0 analyses, 504 macros, 0 operations, 0 seed files, 1 source, 0 exposures
2022-03-24 17:46:53 normalization >
2022-03-24 17:46:55 normalization > 17:46:55 | Concurrency: 5 threads (target='prod')
2022-03-24 17:46:55 normalization > 17:46:55 |
2022-03-24 17:46:59 normalization > 17:46:59 | 1 of 1 START incremental model AIRBYTE_SCHEMA.COVID_EPIDEMOLOGY.............................................. [RUN]
2022-03-24 17:47:04 normalization > 17:47:04 | 1 of 1 OK created incremental model AIRBYTE_SCHEMA.COVID_EPIDEMOLOGY......................................... [SUCCESS 1 in 5.03s]
2022-03-24 17:47:04 normalization > 17:47:04 |
2022-03-24 17:47:04 normalization > 17:47:04 | Finished running 1 incremental model in 11.61s.
2022-03-24 17:47:04 normalization >
2022-03-24 17:47:04 normalization > Completed successfully
2022-03-24 17:47:04 normalization >
2022-03-24 17:47:04 normalization > Done. PASS=1 WARN=0 ERROR=0 SKIP=0 TOTAL=1
2022-03-24 17:47:05 INFO i.a.w.DefaultNormalizationWorker(run):69 - Normalization executed in 17 seconds.
2022-03-24 17:47:05 INFO i.a.w.t.TemporalAttemptExecution(get):131 - Stopping cancellation check scheduling...
2022-03-24 17:47:05 INFO i.a.w.t.TemporalUtils(withBackgroundHeartbeat):235 - Stopping temporal heartbeating...
2022-03-24 17:47:05 INFO i.a.w.t.TemporalAttemptExecution(get):105 - Docker volume job log path: /tmp/workspace/82/1/logs.log
2022-03-24 17:47:05 INFO i.a.w.t.TemporalAttemptExecution(get):110 - Executing worker wrapper. Airbyte version: 0.35.45-alpha
2022-03-24 17:47:05 INFO i.a.w.DbtTransformationWorker(run):44 - Running dbt transformation.
2022-03-24 17:47:05 INFO i.a.w.n.DefaultNormalizationRunner(runProcess):122 - Running with normalization version: airbyte/normalization-snowflake:0.1.68
2022-03-24 17:47:05 INFO i.a.c.i.LineGobbler(voidCall):82 - Checking if airbyte/normalization-snowflake:0.1.68 exists...
2022-03-24 17:47:05 INFO i.a.c.i.LineGobbler(voidCall):82 - airbyte/normalization-snowflake:0.1.68 was found locally.
2022-03-24 17:47:05 INFO i.a.w.p.DockerProcessFactory(create):104 - Creating docker job ID: 82
2022-03-24 17:47:05 INFO i.a.w.p.DockerProcessFactory(create):155 - Preparing command: docker run --rm --init -i -w /data/82/1/transform --log-driver none --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local airbyte/normalization-snowflake:0.1.68 configure-dbt --integration-type snowflake --config destination_config.json --git-repo https://github.com/Bharath-63/custom_dbt.git
2022-03-24 17:47:05 normalization > Running: git clone --depth 5 --single-branch $GIT_REPO git_repo
2022-03-24 17:47:05 normalization > Cloning into 'git_repo'...
2022-03-24 17:47:06 normalization > fatal: could not read Username for 'https://github.com': No such device or address
2022-03-24 17:47:06 INFO i.a.w.t.TemporalAttemptExecution(lambda$getWorkerThread$2):158 - Completing future exceptionally...
io.airbyte.workers.WorkerException: Dbt Transformation Failed.
    at io.airbyte.workers.DbtTransformationWorker.run(DbtTransformationWorker.java:57) ~[io.airbyte-airbyte-workers-0.35.45-alpha.jar:?]
    at io.airbyte.workers.DbtTransformationWorker.run(DbtTransformationWorker.java:16) ~[io.airbyte-airbyte-workers-0.35.45-alpha.jar:?]
    at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$2(TemporalAttemptExecution.java:155) ~[io.airbyte-airbyte-workers-0.35.45-alpha.jar:?]
    at java.lang.Thread.run(Thread.java:833) [?:?]
Caused by: io.airbyte.workers.WorkerException: DBT Transformation Failed.
    at io.airbyte.workers.DbtTransformationWorker.run(DbtTransformationWorker.java:54) ~[io.airbyte-airbyte-workers-0.35.45-alpha.jar:?]
    ... 3 more
    Suppressed: io.airbyte.workers.WorkerException: Normalization process wasn't successful
        at io.airbyte.workers.normalization.DefaultNormalizationRunner.close(DefaultNormalizationRunner.java:159) ~[io.airbyte-airbyte-workers-0.35.45-alpha.jar:?]
        at io.airbyte.workers.DbtTransformationRunner.close(DbtTransformationRunner.java:125) ~[io.airbyte-airbyte-workers-0.35.45-alpha.jar:?]
        at io.airbyte.workers.DbtTransformationWorker.run(DbtTransformationWorker.java:43) ~[io.airbyte-airbyte-workers-0.35.45-alpha.jar:?]
        at io.airbyte.workers.DbtTransformationWorker.run(DbtTransformationWorker.java:16) ~[io.airbyte-airbyte-workers-0.35.45-alpha.jar:?]
        at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$2(TemporalAttemptExecution.java:155) ~[io.airbyte-airbyte-workers-0.35.45-alpha.jar:?]
        at java.lang.Thread.run(Thread.java:833) [?:?]
2022-03-24 17:47:06 INFO i.a.w.t.TemporalAttemptExecution(get):131 - Stopping cancellation check scheduling...
2022-03-24 17:47:06 INFO i.a.w.t.TemporalUtils(withBackgroundHeartbeat):235 - Stopping temporal heartbeating...
2022-03-24 17:47:06 WARN i.t.i.a.POJOActivityTaskHandler(activityFailureToResult):307 - Activity failure. ActivityId=f1548ba4-759d-3120-b813-0381df26ff0f, activityType=Run, attempt=1
java.lang.RuntimeException: io.temporal.serviceclient.CheckedExceptionWrapper: java.util.concurrent.ExecutionException: io.airbyte.workers.WorkerException: Dbt Transformation Failed.
    at io.airbyte.workers.temporal.TemporalUtils.withBackgroundHeartbeat(TemporalUtils.java:233) ~[io.airbyte-airbyte-workers-0.35.45-alpha.jar:?]
    at io.airbyte.workers.temporal.sync.DbtTransformationActivityImpl.run(DbtTransformationActivityImpl.java:74) ~[io.airbyte-airbyte-workers-0.35.45-alpha.jar:?]
    at jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:?]
    at jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77) ~[?:?]
    at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:?]
    at java.lang.reflect.Method.invoke(Method.java:568) ~[?:?]
    at io.temporal.internal.activity.POJOActivityTaskHandler$POJOActivityInboundCallsInterceptor.execute(POJOActivityTaskHandler.java:214) ~[temporal-sdk-1.8.1.jar:?]
    at io.temporal.internal.activity.POJOActivityTaskHandler$POJOActivityImplementation.execute(POJOActivityTaskHandler.java:180) ~[temporal-sdk-1.8.1.jar:?]
    at io.temporal.internal.activity.POJOActivityTaskHandler.handle(POJOActivityTaskHandler.java:120) ~[temporal-sdk-1.8.1.jar:?]
    at io.temporal.internal.worker.ActivityWorker$TaskHandlerImpl.handle(ActivityWorker.java:204) ~[temporal-sdk-1.8.1.jar:?]
    at io.temporal.internal.worker.ActivityWorker$TaskHandlerImpl.handle(ActivityWorker.java:164) ~[temporal-sdk-1.8.1.jar:?]
    at io.temporal.internal.worker.PollTaskExecutor.lambda$process$0(PollTaskExecutor.java:93) ~[temporal-sdk-1.8.1.jar:?]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
    at java.lang.Thread.run(Thread.java:833) [?:?]
Caused by: io.temporal.serviceclient.CheckedExceptionWrapper: java.util.concurrent.ExecutionException: io.airbyte.workers.WorkerException: Dbt Transformation Failed.
    at io.temporal.serviceclient.CheckedExceptionWrapper.wrap(CheckedExceptionWrapper.java:56) ~[temporal-serviceclient-1.8.1.jar:?]
    at io.temporal.internal.sync.WorkflowInternal.wrap(WorkflowInternal.java:448) ~[temporal-sdk-1.8.1.jar:?]
    at io.temporal.activity.Activity.wrap(Activity.java:51) ~[temporal-sdk-1.8.1.jar:?]
    at io.airbyte.workers.temporal.TemporalAttemptExecution.get(TemporalAttemptExecution.java:135) ~[io.airbyte-airbyte-workers-0.35.45-alpha.jar:?]
    at io.airbyte.workers.temporal.sync.DbtTransformationActivityImpl.lambda$run$1(DbtTransformationActivityImpl.java:100) ~[io.airbyte-airbyte-workers-0.35.45-alpha.jar:?]
    at io.airbyte.workers.temporal.TemporalUtils.withBackgroundHeartbeat(TemporalUtils.java:228) ~[io.airbyte-airbyte-workers-0.35.45-alpha.jar:?]
    ... 14 more
Caused by: java.util.concurrent.ExecutionException: io.airbyte.workers.WorkerException: Dbt Transformation Failed.
    at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:396) ~[?:?]
    at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:2073) ~[?:?]
    at io.airbyte.workers.temporal.TemporalAttemptExecution.get(TemporalAttemptExecution.java:129) ~[io.airbyte-airbyte-workers-0.35.45-alpha.jar:?]
    at io.airbyte.workers.temporal.sync.DbtTransformationActivityImpl.lambda$run$1(DbtTransformationActivityImpl.java:100) ~[io.airbyte-airbyte-workers-0.35.45-alpha.jar:?]
    at io.airbyte.workers.temporal.TemporalUtils.withBackgroundHeartbeat(TemporalUtils.java:228) ~[io.airbyte-airbyte-workers-0.35.45-alpha.jar:?]
    ... 14 more
Caused by: io.airbyte.workers.WorkerException: Dbt Transformation Failed.
    at io.airbyte.workers.DbtTransformationWorker.run(DbtTransformationWorker.java:57) ~[io.airbyte-airbyte-workers-0.35.45-alpha.jar:?]
    at io.airbyte.workers.DbtTransformationWorker.run(DbtTransformationWorker.java:16) ~[io.airbyte-airbyte-workers-0.35.45-alpha.jar:?]
    at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$2(TemporalAttemptExecution.java:155) ~[io.airbyte-airbyte-workers-0.35.45-alpha.jar:?]
    ... 1 more
Caused by: io.airbyte.workers.WorkerException: DBT Transformation Failed.
    at io.airbyte.workers.DbtTransformationWorker.run(DbtTransformationWorker.java:54) ~[io.airbyte-airbyte-workers-0.35.45-alpha.jar:?]
    at io.airbyte.workers.DbtTransformationWorker.run(DbtTransformationWorker.java:16) ~[io.airbyte-airbyte-workers-0.35.45-alpha.jar:?]
    at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$2(TemporalAttemptExecution.java:155) ~[io.airbyte-airbyte-workers-0.35.45-alpha.jar:?]
    ... 1 more
    Suppressed: io.airbyte.workers.WorkerException: Normalization process wasn't successful
        at io.airbyte.workers.normalization.DefaultNormalizationRunner.close(DefaultNormalizationRunner.java:159) ~[io.airbyte-airbyte-workers-0.35.45-alpha.jar:?]
        at io.airbyte.workers.DbtTransformationRunner.close(DbtTransformationRunner.java:125) ~[io.airbyte-airbyte-workers-0.35.45-alpha.jar:?]
        at io.airbyte.workers.DbtTransformationWorker.run(DbtTransformationWorker.java:43) ~[io.airbyte-airbyte-workers-0.35.45-alpha.jar:?]
        at io.airbyte.workers.DbtTransformationWorker.run(DbtTransformationWorker.java:16) ~[io.airbyte-airbyte-workers-0.35.45-alpha.jar:?]
        at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$2(TemporalAttemptExecution.java:155) ~[io.airbyte-airbyte-workers-0.35.45-alpha.jar:?]
        at java.lang.Thread.run(Thread.java:833) [?:?]
2022-03-24 17:47:06 INFO i.a.v.j.JsonSchemaValidator(test):56 - JSON schema validation failed.
errors: $.method: does not have a value in the enumeration [Standard]


octavia-squidington-iii commented 2 years ago

Marcos Marx commented:

2022-03-24 17:47:05 normalization > Running: git clone --depth 5 --single-branch $GIT_REPO git_repo
2022-03-24 17:47:05 normalization > Cloning into 'git_repo'...
2022-03-24 17:47:06 normalization > fatal: could not read Username for 'https://github.com': No such device or address

@Saisriram1 please read our docs explaining how to set up a custom dbt transformation: https://docs.airbyte.com/operator-guides/transformation-and-normalization/transformations-with-airbyte#how-to-use-custom-dbt-tips
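
For context, the `fatal: could not read Username for 'https://github.com'` line above means git tried to prompt for credentials inside a non-interactive container, which is what happens when the configured repository is private (or the URL is wrong). One common workaround, sketched here with hypothetical placeholder values, is to embed a GitHub personal access token in the repository URL given to the custom transformation:

https://<username>:<personal-access-token>@github.com/Bharath-63/custom_dbt.git

Both `<username>` and `<personal-access-token>` are placeholders to replace with real values; note that the token is then stored in plain text in the connection configuration.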


octavia-squidington-iii commented 2 years ago

Sai commented: We are receiving the below error

Encountered an error while reading the project:
2022-03-26 03:07:51 dbt > ERROR: Runtime Error
2022-03-26 03:07:51 dbt > at path []: Additional properties are not allowed ('model-paths', 'seed-paths' were unexpected)
2022-03-26 03:07:51 dbt >
2022-03-26 03:07:51 dbt > Error encountered in /data/98/0/transform/git_repo/dbt_project.yml
2022-03-26 03:07:51 dbt > Encountered an error:
2022-03-26 03:07:51 dbt > Runtime Error
2022-03-26 03:07:51 dbt > Could not run dbt

Here is the dbt_project.yml file syntax; please help with a workaround/resolution:

name: 'source_asdbt'
version: '1.0.0'
config-version: 2

# This setting configures which "profile" dbt uses for this project.
profile: 'source_asdbt'

# These configurations specify where dbt should look for different types of files.
# The model-paths config, for example, states that models in this project can be
# found in the "models/" directory. You probably won't need to change these!
model-paths: ["models"]
analysis-paths: ["analyses"]
test-paths: ["tests"]
seed-paths: ["seeds"]
macro-paths: ["macros"]
snapshot-paths: ["snapshots"]

target-path: "target"  # directory which will store compiled SQL files
clean-targets:         # directories to be removed by dbt clean
  - "target"
  - "dbt_packages"

# Configuring models
# Full documentation: https://docs.getdbt.com/docs/configuring-models

# In this example config, we tell dbt to build all models in the example/ directory
# as tables. These settings can be overridden in the individual model files
# using the {{ config(...) }} macro.
models:
  source_asdbt:
    # Config indicated by + and applies to all files under models/example/
    example:
      +materialized: view
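
The "Additional properties are not allowed" error is consistent with a dbt version mismatch rather than a broken file: `model-paths` and `seed-paths` only exist in dbt 1.0 and later, while the normalization image in the log above reports `Running with dbt=0.21.1`, and pre-1.0 dbt rejects unknown top-level keys. A minimal sketch of the same project file using the pre-1.0 key names (illustrative only, not verified against this repository):

name: 'source_asdbt'
version: '1.0.0'
config-version: 2
profile: 'source_asdbt'

# dbt < 1.0 names for the path settings that were renamed in dbt 1.0:
source-paths: ["models"]   # became model-paths in dbt 1.0
data-paths: ["seeds"]      # became seed-paths in dbt 1.0
analysis-paths: ["analyses"]
test-paths: ["tests"]
macro-paths: ["macros"]
snapshot-paths: ["snapshots"]

target-path: "target"
clean-targets:
  - "target"
  - "dbt_modules"          # dbt 1.0 renamed this directory to dbt_packages

models:
  source_asdbt:
    example:
      +materialized: view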

octavia-squidington-iii commented 2 years ago

Sai commented: Hi Team,

Please help with this issue.

Regards,

Sai Sriram

octavia-squidington-iii commented 2 years ago

Jerri Comeau commented: Hi Sai,

Have you checked against https://docs.getdbt.com/reference/dbt_project.yml to make sure your config declaration and your destination are set correctly?

octavia-squidington-iii commented 2 years ago

Marcos Marx commented: @Saisriram1 I recommend you watch the video explaining how to configure custom dbt transformations (Demo Hour: Custom dbt Transformations on YouTube); you need to add a previous step to install the correct deps.
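
In Airbyte's custom transformation form, the dbt CLI arguments are configurable per operation, so one way to apply this advice (a sketch, assuming the standard form fields; not verified against this exact setup) is to chain two operations pointing at the same repository:

- First operation, entrypoint arguments: deps (installs the packages declared in packages.yml)
- Second operation, entrypoint arguments: run (executes the models)

This way `dbt deps` has populated the package directory before `dbt run` compiles the project.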


octavia-squidington-iii commented 2 years ago

Sai commented: Thank you Marcos, this is sorted.

Regards,

Sai Sriram