airbytehq / airbyte

The leading data integration platform for ETL / ELT data pipelines from APIs, databases & files to data warehouses, data lakes & data lakehouses. Both self-hosted and Cloud-hosted.
https://airbyte.com

Destination MSSQL: dbt normalization not escaping passwords correctly #34498

Closed: ibrentlam closed this issue 3 months ago

ibrentlam commented 9 months ago

Connector Name

destination-mssql

Connector Version

0.2.0

At what step did the error happen?

During the sync

Relevant information

When connecting to MS SQL Server with a service account whose password is "5!x0#NGKVKm}rc[^,5Su", the initial connector setup passed, and with no transformation (Raw data (JSON)) everything works. However, when the "Normalized tabular data" transformation is activated, the transform fails with: ('28000', "[28000] [Microsoft][ODBC Driver 17 for SQL Server][SQL Server]Login failed for user 'SQLReporter'. (18456) (SQLDriverConnect)") and the DBA reported failed logins on the SQL Server side. The _airbyte_raw_* table was created but not populated.

After much cogitation, we changed the password to be straight alphanumeric, with no special characters. The dbt normalization worked first time.

My conclusion is that the process that escapes the password for the dbt transform isn't handling one or more of the special characters in the password above correctly.
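
For illustration only (this is not Airbyte's or dbt-sqlserver's actual code), here is a minimal Python sketch of the kind of escaping I suspect is missing. As I understand ODBC connection-string rules, a PWD value containing `}` has to be wrapped in braces with every literal `}` doubled; if it is braced without doubling, the `}` inside the password ends the attribute early, SQL Server receives a truncated password, and the login fails with error 18456 exactly as in the log below. The server, database, and user names are taken from the dbt debug output further down; everything else is hypothetical.

```python
# Sketch only: illustrates ODBC connection-string quoting, not Airbyte's code.

def quote_odbc_value(value: str) -> str:
    """Brace-quote an ODBC attribute value, doubling any literal '}'."""
    return "{" + value.replace("}", "}}") + "}"

password = "5!x0#NGKVKm}rc[^,5Su"

base = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=192.168.0.25,1433;DATABASE=DataHub;UID=SQLReporter;PWD="
)

# Naive quoting: the '}' inside the password closes the braced value early,
# so the rest of the password is never part of PWD and the login is rejected.
naive = base + "{" + password + "}"

# Correct quoting: the doubled '}}' survives parsing as a literal '}'.
correct = base + quote_odbc_value(password)

print(naive)
print(correct)
# e.g. with pyodbc (assumed installed):
#   import pyodbc; pyodbc.connect(correct, timeout=5)
```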

Relevant log output

2024-01-24 18:21:03 platform > Retry State: RetryManager(completeFailureBackoffPolicy=BackoffPolicy(minInterval=PT10S, maxInterval=PT30M, base=3), partialFailureBackoffPolicy=null, successiveCompleteFailureLimit=5, totalCompleteFailureLimit=10, successivePartialFailureLimit=1000, totalPartialFailureLimit=10, successiveCompleteFailures=0, totalCompleteFailures=0, successivePartialFailures=9, totalPartialFailures=9)
2024-01-24 18:21:03 platform > Backing off for: 0 seconds.
2024-01-24 18:21:04 platform > Docker volume job log path: /tmp/workspace/384/9/logs.log
2024-01-24 18:21:04 platform > Executing worker wrapper. Airbyte version: 0.50.44
2024-01-24 18:21:04 platform > Attempt 0 to save workflow id for cancellation
2024-01-24 18:21:04 platform > Using default value for environment variable SIDECAR_KUBE_CPU_LIMIT: '2.0'
2024-01-24 18:21:04 platform > Using default value for environment variable SOCAT_KUBE_CPU_LIMIT: '2.0'
2024-01-24 18:21:04 platform > 
2024-01-24 18:21:04 platform > ----- START CHECK -----
2024-01-24 18:21:04 platform > Using default value for environment variable SIDECAR_KUBE_CPU_REQUEST: '0.1'
2024-01-24 18:21:04 platform > 
2024-01-24 18:21:04 platform > Using default value for environment variable SOCAT_KUBE_CPU_REQUEST: '0.1'
2024-01-24 18:21:04 platform > Using default value for environment variable LAUNCHDARKLY_KEY: ''
2024-01-24 18:21:04 platform > Checking if airbyte/source-snowflake:0.2.0 exists...
2024-01-24 18:21:04 platform > airbyte/source-snowflake:0.2.0 was found locally.
2024-01-24 18:21:04 platform > Creating docker container = source-snowflake-check-384-9-tgwgj with resources io.airbyte.config.ResourceRequirements@1b507250[cpuRequest=,cpuLimit=,memoryRequest=,memoryLimit=,additionalProperties={}] and allowedHosts io.airbyte.config.AllowedHosts@33855888[hosts=[jcxbjjc-prod_5fc921f3695d4_sf.snowflakecomputing.com, *.datadoghq.com, *.datadoghq.eu, *.sentry.io],additionalProperties={}]
2024-01-24 18:21:04 platform > Preparing command: docker run --rm --init -i -w /data/384/9 --log-driver none --name source-snowflake-check-384-9-tgwgj --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e DEPLOYMENT_MODE=OSS -e WORKER_CONNECTOR_IMAGE=airbyte/source-snowflake:0.2.0 -e AUTO_DETECT_SCHEMA=true -e LAUNCHDARKLY_KEY= -e SOCAT_KUBE_CPU_REQUEST=0.1 -e SOCAT_KUBE_CPU_LIMIT=2.0 -e FIELD_SELECTION_WORKSPACES= -e USE_STREAM_CAPABLE_STATE=true -e AIRBYTE_ROLE=dev -e WORKER_ENVIRONMENT=DOCKER -e APPLY_FIELD_SELECTION=false -e WORKER_JOB_ATTEMPT=9 -e OTEL_COLLECTOR_ENDPOINT=http://host.docker.internal:4317 -e FEATURE_FLAG_CLIENT=config -e AIRBYTE_VERSION=0.50.44 -e WORKER_JOB_ID=384 airbyte/source-snowflake:0.2.0 check --config source_config.json
2024-01-24 18:21:04 platform > Reading messages from protocol version 0.2.0
2024-01-24 18:21:06 platform > INFO i.a.i.b.a.AdaptiveSourceRunner$Runner(getSource):73 Running source under deployment mode: OSS
2024-01-24 18:21:06 platform > INFO i.a.i.b.a.AdaptiveSourceRunner$Runner(run):85 Starting source: io.airbyte.integrations.source.snowflake.SnowflakeSource
2024-01-24 18:21:07 platform > INFO i.a.i.b.IntegrationCliParser(parseOptions):126 integration args: {check=null, config=source_config.json}
2024-01-24 18:21:07 platform > INFO i.a.i.b.IntegrationRunner(runInternal):106 Running integration: io.airbyte.integrations.source.snowflake.SnowflakeSource
2024-01-24 18:21:07 platform > INFO i.a.i.b.IntegrationRunner(runInternal):107 Command: CHECK
2024-01-24 18:21:07 platform > INFO i.a.i.b.IntegrationRunner(runInternal):108 Integration config: IntegrationConfig{command=CHECK, configPath='source_config.json', catalogPath='null', statePath='null'}
2024-01-24 18:21:07 platform > WARN c.n.s.JsonMetaSchema(newValidator):278 Unknown keyword order - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2024-01-24 18:21:07 platform > WARN c.n.s.JsonMetaSchema(newValidator):278 Unknown keyword airbyte_secret - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2024-01-24 18:21:07 platform > INFO i.a.i.s.s.SnowflakeSource(buildUsernamePasswordConfig):121 jdbc:snowflake://jcxbjjc-prod_5fc921f3695d4_sf.snowflakecomputing.com/?role=BBDATA_USER_ROLE&warehouse=BLACKBOARD_DATA_WH&database=BLACKBOARD_DATA_3FE097406ED44FEEADC552D05E1F8A7E&JDBC_QUERY_RESULT_FORMAT=JSON&CLIENT_SESSION_KEEP_ALIVE=true&application=airbyte_oss
2024-01-24 18:21:08 platform > INFO i.a.i.s.s.SnowflakeDataSourceUtils(createDataSource):77 Authorization mode is 'Username and password'
2024-01-24 18:21:08 platform > INFO c.z.h.HikariDataSource(getConnection):110 HikariPool-1 - Starting...
2024-01-24 18:21:10 platform > INFO c.z.h.p.HikariPool(checkFailFast):565 HikariPool-1 - Added connection net.snowflake.client.jdbc.SnowflakeConnectionV1@597f0937
2024-01-24 18:21:10 platform > INFO c.z.h.HikariDataSource(getConnection):123 HikariPool-1 - Start completed.
2024-01-24 18:21:10 platform > INFO i.a.i.s.j.AbstractJdbcSource(lambda$getCheckOperations$1):138 Attempting to get metadata from the database to see if we can connect.
2024-01-24 18:21:10 platform > INFO c.z.h.HikariDataSource(close):350 HikariPool-1 - Shutdown initiated...
2024-01-24 18:21:11 platform > INFO c.z.h.HikariDataSource(close):352 HikariPool-1 - Shutdown completed.
2024-01-24 18:21:11 platform > INFO i.a.i.b.IntegrationRunner(runInternal):195 Completed integration: io.airbyte.integrations.source.snowflake.SnowflakeSource
2024-01-24 18:21:11 platform > INFO i.a.i.b.a.AdaptiveSourceRunner$Runner(run):87 Completed source: io.airbyte.integrations.source.snowflake.SnowflakeSource
2024-01-24 18:21:12 platform > Check connection job received output: io.airbyte.config.StandardCheckConnectionOutput@76e87b35[status=succeeded,message=<null>,additionalProperties={}]
2024-01-24 18:21:12 platform > 
2024-01-24 18:21:12 platform > ----- END CHECK -----
2024-01-24 18:21:12 platform > 
2024-01-24 18:21:12 platform > Docker volume job log path: /tmp/workspace/384/9/logs.log
2024-01-24 18:21:12 platform > Executing worker wrapper. Airbyte version: 0.50.44
2024-01-24 18:21:12 platform > Attempt 0 to save workflow id for cancellation
2024-01-24 18:21:12 platform > Using default value for environment variable SIDECAR_KUBE_CPU_LIMIT: '2.0'
2024-01-24 18:21:12 platform > 
2024-01-24 18:21:12 platform > Using default value for environment variable SOCAT_KUBE_CPU_LIMIT: '2.0'
2024-01-24 18:21:12 platform > ----- START CHECK -----
2024-01-24 18:21:12 platform > 
2024-01-24 18:21:12 platform > Using default value for environment variable SIDECAR_KUBE_CPU_REQUEST: '0.1'
2024-01-24 18:21:12 platform > Using default value for environment variable SOCAT_KUBE_CPU_REQUEST: '0.1'
2024-01-24 18:21:12 platform > Using default value for environment variable LAUNCHDARKLY_KEY: ''
2024-01-24 18:21:12 platform > Checking if airbyte/destination-mssql:0.2.0 exists...
2024-01-24 18:21:12 platform > airbyte/destination-mssql:0.2.0 was found locally.
2024-01-24 18:21:12 platform > Creating docker container = destination-mssql-check-384-9-xplwa with resources io.airbyte.config.ResourceRequirements@1395a209[cpuRequest=,cpuLimit=,memoryRequest=,memoryLimit=,additionalProperties={}] and allowedHosts null
2024-01-24 18:21:12 platform > Preparing command: docker run --rm --init -i -w /data/384/9 --log-driver none --name destination-mssql-check-384-9-xplwa --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e DEPLOYMENT_MODE=OSS -e WORKER_CONNECTOR_IMAGE=airbyte/destination-mssql:0.2.0 -e AUTO_DETECT_SCHEMA=true -e LAUNCHDARKLY_KEY= -e SOCAT_KUBE_CPU_REQUEST=0.1 -e SOCAT_KUBE_CPU_LIMIT=2.0 -e FIELD_SELECTION_WORKSPACES= -e USE_STREAM_CAPABLE_STATE=true -e AIRBYTE_ROLE=dev -e WORKER_ENVIRONMENT=DOCKER -e APPLY_FIELD_SELECTION=false -e WORKER_JOB_ATTEMPT=9 -e OTEL_COLLECTOR_ENDPOINT=http://host.docker.internal:4317 -e FEATURE_FLAG_CLIENT=config -e AIRBYTE_VERSION=0.50.44 -e WORKER_JOB_ID=384 airbyte/destination-mssql:0.2.0 check --config source_config.json
2024-01-24 18:21:12 platform > Reading messages from protocol version 0.2.0
2024-01-24 18:21:15 platform > INFO i.a.i.d.m.MSSQLDestination(main):102 starting destination: class io.airbyte.integrations.destination.mssql.MSSQLDestination
2024-01-24 18:21:15 platform > INFO i.a.i.b.IntegrationCliParser(parseOptions):126 integration args: {check=null, config=source_config.json}
2024-01-24 18:21:15 platform > INFO i.a.i.b.IntegrationRunner(runInternal):106 Running integration: io.airbyte.integrations.base.ssh.SshWrappedDestination
2024-01-24 18:21:15 platform > INFO i.a.i.b.IntegrationRunner(runInternal):107 Command: CHECK
2024-01-24 18:21:15 platform > INFO i.a.i.b.IntegrationRunner(runInternal):108 Integration config: IntegrationConfig{command=CHECK, configPath='source_config.json', catalogPath='null', statePath='null'}
2024-01-24 18:21:15 platform > WARN c.n.s.JsonMetaSchema(newValidator):278 Unknown keyword order - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2024-01-24 18:21:15 platform > WARN c.n.s.JsonMetaSchema(newValidator):278 Unknown keyword airbyte_secret - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2024-01-24 18:21:16 platform > INFO i.a.i.b.s.SshTunnel(getInstance):204 Starting connection with method: NO_TUNNEL
2024-01-24 18:21:16 platform > INFO c.z.h.HikariDataSource(<init>):80 HikariPool-1 - Starting...
2024-01-24 18:21:16 platform > INFO c.z.h.HikariDataSource(<init>):82 HikariPool-1 - Start completed.
2024-01-24 18:21:17 platform > INFO c.z.h.HikariDataSource(close):350 HikariPool-1 - Shutdown initiated...
2024-01-24 18:21:17 platform > INFO c.z.h.HikariDataSource(close):352 HikariPool-1 - Shutdown completed.
2024-01-24 18:21:17 platform > INFO i.a.i.b.IntegrationRunner(runInternal):197 Completed integration: io.airbyte.integrations.base.ssh.SshWrappedDestination
2024-01-24 18:21:17 platform > INFO i.a.i.d.m.MSSQLDestination(main):104 completed destination: class io.airbyte.integrations.destination.mssql.MSSQLDestination
2024-01-24 18:21:17 platform > Destination process done (exit code 0)
2024-01-24 18:21:17 platform > Check connection job received output: io.airbyte.config.StandardCheckConnectionOutput@132a5bbd[status=succeeded,message=<null>,additionalProperties={}]
2024-01-24 18:21:17 platform > 
2024-01-24 18:21:17 platform > ----- END CHECK -----
2024-01-24 18:21:17 platform > 
2024-01-24 18:21:18 platform > Docker volume job log path: /tmp/workspace/384/9/logs.log
2024-01-24 18:21:18 platform > Executing worker wrapper. Airbyte version: 0.50.44
2024-01-24 18:21:18 platform > Attempt 0 to save workflow id for cancellation
2024-01-24 18:21:18 platform > start sync worker. job id: 384 attempt id: 9
2024-01-24 18:21:18 platform > 
2024-01-24 18:21:18 platform > ----- START REPLICATION -----
2024-01-24 18:21:18 platform > 
2024-01-24 18:21:18 platform > Running destination...
2024-01-24 18:21:18 platform > Using default value for environment variable SIDECAR_KUBE_CPU_LIMIT: '2.0'
2024-01-24 18:21:18 platform > Using default value for environment variable SOCAT_KUBE_CPU_LIMIT: '2.0'
2024-01-24 18:21:18 platform > Using default value for environment variable SIDECAR_KUBE_CPU_LIMIT: '2.0'
2024-01-24 18:21:18 platform > Using default value for environment variable SIDECAR_KUBE_CPU_REQUEST: '0.1'
2024-01-24 18:21:18 platform > Using default value for environment variable SOCAT_KUBE_CPU_REQUEST: '0.1'
2024-01-24 18:21:18 platform > Using default value for environment variable SOCAT_KUBE_CPU_LIMIT: '2.0'
2024-01-24 18:21:18 platform > Using default value for environment variable LAUNCHDARKLY_KEY: ''
2024-01-24 18:21:18 platform > Using default value for environment variable SIDECAR_KUBE_CPU_REQUEST: '0.1'
2024-01-24 18:21:18 platform > Using default value for environment variable SOCAT_KUBE_CPU_REQUEST: '0.1'
2024-01-24 18:21:18 platform > Using default value for environment variable LAUNCHDARKLY_KEY: ''
2024-01-24 18:21:18 platform > Checking if airbyte/destination-mssql:0.2.0 exists...
2024-01-24 18:21:18 platform > Checking if airbyte/source-snowflake:0.2.0 exists...
2024-01-24 18:21:18 platform > airbyte/source-snowflake:0.2.0 was found locally.
2024-01-24 18:21:18 platform > Creating docker container = source-snowflake-read-384-9-agiye with resources io.airbyte.config.ResourceRequirements@2478dbd4[cpuRequest=1,cpuLimit=2,memoryRequest=1Gi,memoryLimit=2Gi,additionalProperties={}] and allowedHosts io.airbyte.config.AllowedHosts@13e7735c[hosts=[jcxbjjc-prod_5fc921f3695d4_sf.snowflakecomputing.com, *.datadoghq.com, *.datadoghq.eu, *.sentry.io],additionalProperties={}]
2024-01-24 18:21:18 platform > Preparing command: docker run --rm --init -i -w /data/384/9 --log-driver none --name source-snowflake-read-384-9-agiye -e CONCURRENT_SOURCE_STREAM_READ=false --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e DEPLOYMENT_MODE=OSS -e WORKER_CONNECTOR_IMAGE=airbyte/source-snowflake:0.2.0 -e AUTO_DETECT_SCHEMA=true -e LAUNCHDARKLY_KEY= -e SOCAT_KUBE_CPU_REQUEST=0.1 -e SOCAT_KUBE_CPU_LIMIT=2.0 -e FIELD_SELECTION_WORKSPACES= -e USE_STREAM_CAPABLE_STATE=true -e AIRBYTE_ROLE=dev -e WORKER_ENVIRONMENT=DOCKER -e APPLY_FIELD_SELECTION=false -e WORKER_JOB_ATTEMPT=9 -e OTEL_COLLECTOR_ENDPOINT=http://host.docker.internal:4317 -e FEATURE_FLAG_CLIENT=config -e AIRBYTE_VERSION=0.50.44 -e WORKER_JOB_ID=384 --cpus=2 --memory-reservation=1Gi --memory=2Gi airbyte/source-snowflake:0.2.0 read --config source_config.json --catalog source_catalog.json
2024-01-24 18:21:18 platform > Reading messages from protocol version 0.2.0
2024-01-24 18:21:18 platform > airbyte/destination-mssql:0.2.0 was found locally.
2024-01-24 18:21:18 platform > Creating docker container = destination-mssql-write-384-9-xygca with resources io.airbyte.config.ResourceRequirements@4c594721[cpuRequest=1,cpuLimit=2,memoryRequest=1Gi,memoryLimit=2Gi,additionalProperties={}] and allowedHosts null
2024-01-24 18:21:18 platform > Preparing command: docker run --rm --init -i -w /data/384/9 --log-driver none --name destination-mssql-write-384-9-xygca --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e DEPLOYMENT_MODE=OSS -e WORKER_CONNECTOR_IMAGE=airbyte/destination-mssql:0.2.0 -e AUTO_DETECT_SCHEMA=true -e LAUNCHDARKLY_KEY= -e SOCAT_KUBE_CPU_REQUEST=0.1 -e SOCAT_KUBE_CPU_LIMIT=2.0 -e FIELD_SELECTION_WORKSPACES= -e USE_STREAM_CAPABLE_STATE=true -e AIRBYTE_ROLE=dev -e WORKER_ENVIRONMENT=DOCKER -e APPLY_FIELD_SELECTION=false -e WORKER_JOB_ATTEMPT=9 -e OTEL_COLLECTOR_ENDPOINT=http://host.docker.internal:4317 -e FEATURE_FLAG_CLIENT=config -e AIRBYTE_VERSION=0.50.44 -e WORKER_JOB_ID=384 --cpus=2 --memory-reservation=1Gi --memory=2Gi airbyte/destination-mssql:0.2.0 write --config destination_config.json --catalog destination_catalog.json
2024-01-24 18:21:18 platform > Writing messages to protocol version 0.2.0
2024-01-24 18:21:18 platform > Reading messages from protocol version 0.2.0
2024-01-24 18:21:18 platform > readFromSource: start
2024-01-24 18:21:18 platform > processMessage: start
2024-01-24 18:21:18 platform > writeToDestination: start
2024-01-24 18:21:18 platform > Starting source heartbeat check. Will check every 1 minutes.
2024-01-24 18:21:18 platform > readFromDestination: start
2024-01-24 18:21:21 source > INFO i.a.i.b.a.AdaptiveSourceRunner$Runner(getSource):73 Running source under deployment mode: OSS
2024-01-24 18:21:21 source > INFO i.a.i.b.a.AdaptiveSourceRunner$Runner(run):85 Starting source: io.airbyte.integrations.source.snowflake.SnowflakeSource
2024-01-24 18:21:21 destination > INFO i.a.i.d.m.MSSQLDestination(main):102 starting destination: class io.airbyte.integrations.destination.mssql.MSSQLDestination
2024-01-24 18:21:21 source > INFO i.a.i.b.IntegrationCliParser(parseOptions):126 integration args: {read=null, catalog=source_catalog.json, config=source_config.json}
2024-01-24 18:21:21 source > INFO i.a.i.b.IntegrationRunner(runInternal):106 Running integration: io.airbyte.integrations.source.snowflake.SnowflakeSource
2024-01-24 18:21:21 source > INFO i.a.i.b.IntegrationRunner(runInternal):107 Command: READ
2024-01-24 18:21:21 source > INFO i.a.i.b.IntegrationRunner(runInternal):108 Integration config: IntegrationConfig{command=READ, configPath='source_config.json', catalogPath='source_catalog.json', statePath='null'}
2024-01-24 18:21:22 destination > INFO i.a.i.b.IntegrationCliParser(parseOptions):126 integration args: {catalog=destination_catalog.json, write=null, config=destination_config.json}
2024-01-24 18:21:22 destination > INFO i.a.i.b.IntegrationRunner(runInternal):106 Running integration: io.airbyte.integrations.base.ssh.SshWrappedDestination
2024-01-24 18:21:22 destination > INFO i.a.i.b.IntegrationRunner(runInternal):107 Command: WRITE
2024-01-24 18:21:22 destination > INFO i.a.i.b.IntegrationRunner(runInternal):108 Integration config: IntegrationConfig{command=WRITE, configPath='destination_config.json', catalogPath='destination_catalog.json', statePath='null'}
2024-01-24 18:21:22 source > WARN c.n.s.JsonMetaSchema(newValidator):278 Unknown keyword order - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2024-01-24 18:21:22 source > WARN c.n.s.JsonMetaSchema(newValidator):278 Unknown keyword airbyte_secret - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2024-01-24 18:21:22 source > INFO i.a.i.s.r.s.StateManagerFactory(createStateManager):48 Legacy state manager selected to manage state object with type LEGACY.
2024-01-24 18:21:22 source > INFO i.a.i.s.r.s.CursorManager(createCursorInfoForStream):182 No cursor field set in catalog but not present in state. Stream: CDM_LMS_COURSE, New Cursor Field: null. Resetting cursor value
2024-01-24 18:21:22 source > INFO i.a.i.s.r.CdcStateManager(<init>):29 Initialized CDC state with: null
2024-01-24 18:21:22 destination > WARN c.n.s.JsonMetaSchema(newValidator):278 Unknown keyword order - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2024-01-24 18:21:22 destination > WARN c.n.s.JsonMetaSchema(newValidator):278 Unknown keyword airbyte_secret - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2024-01-24 18:21:23 source > INFO i.a.i.s.s.SnowflakeSource(buildUsernamePasswordConfig):121 jdbc:snowflake://jcxbjjc-prod_5fc921f3695d4_sf.snowflakecomputing.com/?role=BBDATA_USER_ROLE&warehouse=BLACKBOARD_DATA_WH&database=BLACKBOARD_DATA_3FE097406ED44FEEADC552D05E1F8A7E&JDBC_QUERY_RESULT_FORMAT=JSON&CLIENT_SESSION_KEEP_ALIVE=true&application=airbyte_oss
2024-01-24 18:21:23 source > INFO i.a.i.s.s.SnowflakeDataSourceUtils(createDataSource):77 Authorization mode is 'Username and password'
2024-01-24 18:21:23 source > INFO c.z.h.HikariDataSource(getConnection):110 HikariPool-1 - Starting...
2024-01-24 18:21:23 destination > INFO i.a.i.b.s.SshTunnel(getInstance):204 Starting connection with method: NO_TUNNEL
2024-01-24 18:21:23 destination > INFO c.z.h.HikariDataSource(<init>):80 HikariPool-1 - Starting...
2024-01-24 18:21:24 destination > INFO c.z.h.HikariDataSource(<init>):82 HikariPool-1 - Start completed.
2024-01-24 18:21:24 destination > INFO i.a.i.d.j.JdbcBufferedConsumerFactory(lambda$toWriteConfig$0):103 Write config: WriteConfig{streamName=COURSE, namespace=null, outputSchemaName=dbo, tmpTableName=_airbyte_tmp_fha_COURSE, outputTableName=_airbyte_raw_COURSE, syncMode=overwrite}
2024-01-24 18:21:24 destination > INFO i.a.i.d.b.BufferedStreamConsumer(startTracked):173 class io.airbyte.integrations.destination.buffered_stream_consumer.BufferedStreamConsumer started.
2024-01-24 18:21:24 destination > INFO i.a.i.d.j.JdbcBufferedConsumerFactory(lambda$onStartFunction$1):142 Preparing raw tables in destination started for 1 streams
2024-01-24 18:21:24 destination > INFO i.a.i.d.j.JdbcBufferedConsumerFactory(lambda$onStartFunction$1):147 Preparing raw table in destination started for stream COURSE. schema: dbo, table name: _airbyte_raw_COURSE
2024-01-24 18:21:25 destination > INFO i.a.i.d.j.JdbcBufferedConsumerFactory(lambda$onStartFunction$1):160 Preparing raw tables in destination completed.
2024-01-24 18:21:26 source > INFO c.z.h.p.HikariPool(checkFailFast):565 HikariPool-1 - Added connection net.snowflake.client.jdbc.SnowflakeConnectionV1@51b01550
2024-01-24 18:21:26 source > INFO c.z.h.HikariDataSource(getConnection):123 HikariPool-1 - Start completed.
2024-01-24 18:21:26 source > INFO i.a.i.s.j.AbstractJdbcSource(logPreSyncDebugData):450 Data source product recognized as Snowflake:8.3.1
2024-01-24 18:21:26 source > INFO i.a.i.s.j.AbstractJdbcSource(discoverInternal):166 Internal schemas to exclude: [INFORMATION_SCHEMA]
2024-01-24 18:21:28 source > INFO i.a.i.s.j.AbstractJdbcSource(queryTableFullRefresh):112 Queueing query for table: COURSE
2024-01-24 18:21:28 source > INFO i.a.i.s.r.RelationalDbQueryUtils(lambda$queryTable$0):73 Queueing query: SELECT "ID","INSTANCE_ID","COURSE_PARENT_ID","TERM_ID","SOURCE_ID","NAME","DESCRIPTION","START_DATE","END_DATE","COURSE_NUMBER","DESIGN_MODE","DESIGN_MODE_SOURCE_CODE","DESIGN_MODE_SOURCE_DESC","ENROLLMENT_METHOD","ENROLLMENT_METHOD_SOURCE_CODE","ENROLLMENT_METHOD_SOURCE_DESC","AVAILABLE_IND","AVAILABLE_TO_STUDENTS_IND","ENABLED_IND","STAGE","ROW_INSERTED_TIME","ROW_UPDATED_TIME","ROW_DELETED_TIME","MODIFIED_TIME","CREATED_TIME","START_TIME","END_TIME","COPY_FROM_COURSES" FROM "CDM_LMS"."COURSE"
2024-01-24 18:21:29 source > INFO i.a.d.j.s.AdaptiveStreamingQueryConfig(initialize):31 Set initial fetch size: 10 rows
2024-01-24 18:21:29 source > INFO i.a.c.u.CompositeIterator(lambda$emitStartStreamStatus$1):155 STARTING -> CDM_LMS_COURSE
2024-01-24 18:21:29 source > INFO i.a.c.u.CompositeIterator(lambda$emitRunningStreamStatus$0):148 RUNNING -> CDM_LMS_COURSE
2024-01-24 18:21:29 source > INFO i.a.d.j.s.AdaptiveStreamingQueryConfig(accept):40 Set new fetch size: 110052 rows
2024-01-24 18:21:29 source > INFO i.a.d.j.s.TwoStageSizeEstimator(getTargetBufferByteSize):71 Max memory limit: 1610612736, JDBC buffer size: 966367642
2024-01-24 18:21:30 source > INFO i.a.d.j.s.AdaptiveStreamingQueryConfig(accept):40 Set new fetch size: 390452 rows
2024-01-24 18:21:31 source > INFO i.a.d.j.s.AdaptiveStreamingQueryConfig(accept):40 Set new fetch size: 383023 rows
2024-01-24 18:21:31 source > INFO i.a.d.j.s.AdaptiveStreamingQueryConfig(accept):40 Set new fetch size: 381662 rows
2024-01-24 18:21:32 source > INFO i.a.d.j.s.AdaptiveStreamingQueryConfig(accept):40 Set new fetch size: 366882 rows
2024-01-24 18:21:32 source > INFO i.a.d.j.s.AdaptiveStreamingQueryConfig(accept):40 Set new fetch size: 362751 rows
2024-01-24 18:21:33 source > INFO i.a.d.j.s.AdaptiveStreamingQueryConfig(accept):40 Set new fetch size: 336948 rows
2024-01-24 18:21:34 platform > Records read: 5000 (3 MB)
2024-01-24 18:21:36 source > INFO i.a.d.j.s.AdaptiveStreamingQueryConfig(accept):40 Set new fetch size: 303032 rows
2024-01-24 18:21:36 source > INFO i.a.d.j.s.AdaptiveStreamingQueryConfig(accept):40 Set new fetch size: 299928 rows
2024-01-24 18:21:36 destination > INFO i.a.i.d.r.InMemoryRecordBufferingStrategy(flushAllBuffers):85 Flushing COURSE: 7831 records (24 MB)
2024-01-24 18:21:36 source > INFO i.a.i.s.r.AbstractDbSource(lambda$createReadIterator$5):422 Reading stream COURSE. Records read: 10000
2024-01-24 18:21:40 destination > INFO i.a.i.d.r.InMemoryRecordBufferingStrategy(flushAllBuffers):91 Flushing completed for COURSE
2024-01-24 18:21:40 platform > Records read: 10000 (8 MB)
2024-01-24 18:21:40 source > INFO i.a.d.j.s.AdaptiveStreamingQueryConfig(accept):40 Set new fetch size: 299649 rows
2024-01-24 18:21:40 source > INFO i.a.d.j.s.AdaptiveStreamingQueryConfig(accept):40 Set new fetch size: 297710 rows
2024-01-24 18:21:41 source > INFO i.a.d.j.s.AdaptiveStreamingQueryConfig(accept):40 Set new fetch size: 295525 rows
2024-01-24 18:21:41 platform > Records read: 15000 (12 MB)
2024-01-24 18:21:41 destination > INFO i.a.i.d.r.InMemoryRecordBufferingStrategy(flushAllBuffers):85 Flushing COURSE: 7135 records (24 MB)
2024-01-24 18:21:41 source > INFO i.a.d.j.s.AdaptiveStreamingQueryConfig(accept):40 Set new fetch size: 265778 rows
2024-01-24 18:21:44 destination > INFO i.a.i.d.r.InMemoryRecordBufferingStrategy(flushAllBuffers):91 Flushing completed for COURSE
2024-01-24 18:21:44 source > INFO i.a.c.u.CompositeIterator(lambda$emitCompleteStreamStatus$2):162 COMPLETE -> CDM_LMS_COURSE
2024-01-24 18:21:44 source > WARN i.a.i.b.IntegrationRunner(watchForOrphanThreads):278 The main thread is exiting while children non-daemon threads from a connector are still active.
Ideally, this situation should not happen...
Please check with maintainers if the connector or library code should safely clean up its threads before quitting instead.
The main thread is: main (RUNNABLE)
 Thread stacktrace: java.base/java.lang.Thread.getStackTrace(Thread.java:1610)
        at io.airbyte.integrations.base.IntegrationRunner.dumpThread(IntegrationRunner.java:309)
        at io.airbyte.integrations.base.IntegrationRunner.watchForOrphanThreads(IntegrationRunner.java:282)
        at io.airbyte.integrations.base.IntegrationRunner.produceMessages(IntegrationRunner.java:199)
        at io.airbyte.integrations.base.IntegrationRunner.runInternal(IntegrationRunner.java:140)
        at io.airbyte.integrations.base.IntegrationRunner.run(IntegrationRunner.java:99)
        at io.airbyte.integrations.base.adaptive.AdaptiveSourceRunner$Runner.run(AdaptiveSourceRunner.java:86)
        at io.airbyte.integrations.source.snowflake.SnowflakeSourceRunner.main(SnowflakeSourceRunner.java:19)
2024-01-24 18:21:44 source > WARN i.a.i.b.IntegrationRunner(watchForOrphanThreads):290 Active non-daemon thread: pool-5-thread-2 (RUNNABLE)
 Thread stacktrace: java.base@17.0.4.1/sun.nio.ch.Net.poll(Native Method)
        at java.base@17.0.4.1/sun.nio.ch.NioSocketImpl.park(NioSocketImpl.java:181)
        at java.base@17.0.4.1/sun.nio.ch.NioSocketImpl.timedRead(NioSocketImpl.java:285)
        at java.base@17.0.4.1/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:309)
        at java.base@17.0.4.1/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:350)
        at java.base@17.0.4.1/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:803)
        at java.base@17.0.4.1/java.net.Socket$SocketInputStream.read(Socket.java:966)
        at java.base@17.0.4.1/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:478)
        at java.base@17.0.4.1/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:472)
        at java.base@17.0.4.1/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
        at java.base@17.0.4.1/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1460)
        at java.base@17.0.4.1/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1064)
        at app//net.snowflake.client.jdbc.internal.apache.http.impl.io.SessionInputBufferImpl.streamRead(SessionInputBufferImpl.java:137)
        at app//net.snowflake.client.jdbc.internal.apache.http.impl.io.SessionInputBufferImpl.fillBuffer(SessionInputBufferImpl.java:153)
        at app//net.snowflake.client.jdbc.internal.apache.http.impl.io.SessionInputBufferImpl.readLine(SessionInputBufferImpl.java:280)
        at app//net.snowflake.client.jdbc.internal.apache.http.impl.conn.DefaultHttpResponseParser.parseHead(DefaultHttpResponseParser.java:138)
        at app//net.snowflake.client.jdbc.internal.apache.http.impl.conn.DefaultHttpResponseParser.parseHead(DefaultHttpResponseParser.java:56)
        at app//net.snowflake.client.jdbc.internal.apache.http.impl.io.AbstractMessageParser.parse(AbstractMessageParser.java:259)
        at app//net.snowflake.client.jdbc.internal.apache.http.impl.DefaultBHttpClientConnection.receiveResponseHeader(DefaultBHttpClientConnection.java:163)
        at app//net.snowflake.client.jdbc.internal.apache.http.impl.conn.CPoolProxy.receiveResponseHeader(CPoolProxy.java:157)
        at app//net.snowflake.client.jdbc.internal.apache.http.protocol.HttpRequestExecutor.doReceiveResponse(HttpRequestExecutor.java:273)
        at app//net.snowflake.client.jdbc.internal.apache.http.protocol.HttpRequestExecutor.execute(HttpRequestExecutor.java:125)
        at app//net.snowflake.client.jdbc.internal.apache.http.impl.execchain.MainClientExec.execute(MainClientExec.java:272)
        at app//net.snowflake.client.jdbc.internal.apache.http.impl.execchain.ProtocolExec.execute(ProtocolExec.java:186)
        at app//net.snowflake.client.jdbc.internal.apache.http.impl.execchain.RetryExec.execute(RetryExec.java:89)
        at app//net.snowflake.client.jdbc.internal.apache.http.impl.execchain.RedirectExec.execute(RedirectExec.java:110)
        at app//net.snowflake.client.jdbc.internal.apache.http.impl.client.InternalHttpClient.doExecute(InternalHttpClient.java:185)
        at app//net.snowflake.client.jdbc.internal.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:83)
        at app//net.snowflake.client.jdbc.internal.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:108)
        at app//net.snowflake.client.jdbc.RestRequest.execute(RestRequest.java:174)
        at app//net.snowflake.client.core.HttpUtil.executeRequestInternal(HttpUtil.java:737)
        at app//net.snowflake.client.core.HttpUtil.executeRequest(HttpUtil.java:672)
        at app//net.snowflake.client.core.HttpUtil.executeGeneralRequest(HttpUtil.java:594)
        at app//net.snowflake.client.jdbc.telemetry.TelemetryClient.sendBatch(TelemetryClient.java:336)
        at app//net.snowflake.client.jdbc.telemetry.TelemetryClient.lambda$sendBatchAsync$0(TelemetryClient.java:267)
        at app//net.snowflake.client.jdbc.telemetry.TelemetryClient$$Lambda$283/0x000000080100e440.call(Unknown Source)
        at java.base@17.0.4.1/java.util.concurrent.FutureTask.run(FutureTask.java:264)
        at java.base@17.0.4.1/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
        at java.base@17.0.4.1/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
        at java.base@17.0.4.1/java.lang.Thread.run(Thread.java:833)
2024-01-24 18:21:44 source > INFO i.a.i.s.r.AbstractDbSource(lambda$read$1):173 Closing database connection pool.
2024-01-24 18:21:44 source > INFO c.z.h.HikariDataSource(close):350 HikariPool-1 - Shutdown initiated...
2024-01-24 18:21:44 source > INFO c.z.h.HikariDataSource(close):352 HikariPool-1 - Shutdown completed.
2024-01-24 18:21:44 source > INFO i.a.i.s.r.AbstractDbSource(lambda$read$1):175 Closed database connection pool.
2024-01-24 18:21:44 source > INFO i.a.i.b.IntegrationRunner(runInternal):195 Completed integration: io.airbyte.integrations.source.snowflake.SnowflakeSource
2024-01-24 18:21:44 source > INFO i.a.i.b.a.AdaptiveSourceRunner$Runner(run):87 Completed source: io.airbyte.integrations.source.snowflake.SnowflakeSource
2024-01-24 18:21:44 platform > Total records read: 18221 (15 MB)
2024-01-24 18:21:44 platform > Schema validation was performed to a max of 10 records with errors per stream.
2024-01-24 18:21:44 platform > readFromSource: done. (source.isFinished:true, fromSource.isClosed:false)
2024-01-24 18:21:44 platform > thread status... heartbeat thread: false , replication thread: true
2024-01-24 18:21:44 platform > processMessage: done. (fromSource.isDone:true, forDest.isClosed:false)
2024-01-24 18:21:44 platform > writeToDestination: done. (forDest.isDone:true, isDestRunning:true)
2024-01-24 18:21:44 platform > thread status... timeout thread: false , replication thread: true
2024-01-24 18:21:44 destination > INFO i.a.i.b.FailureTrackingAirbyteMessageConsumer(close):80 Airbyte message consumer: succeeded.
2024-01-24 18:21:44 destination > INFO i.a.i.d.b.BufferedStreamConsumer(close):288 executing on success close procedure.
2024-01-24 18:21:44 destination > INFO i.a.i.d.r.InMemoryRecordBufferingStrategy(flushAllBuffers):85 Flushing COURSE: 3369 records (13 MB)
2024-01-24 18:21:46 destination > INFO i.a.i.d.r.InMemoryRecordBufferingStrategy(flushAllBuffers):91 Flushing completed for COURSE
2024-01-24 18:21:46 destination > INFO i.a.i.b.IntegrationRunner(runInternal):197 Completed integration: io.airbyte.integrations.base.ssh.SshWrappedDestination
2024-01-24 18:21:46 destination > INFO i.a.i.d.m.MSSQLDestination(main):104 completed destination: class io.airbyte.integrations.destination.mssql.MSSQLDestination
2024-01-24 18:21:46 destination > Destination process done (exit code 0)
2024-01-24 18:21:46 destination > Skipping in-connector normalization
2024-01-24 18:21:46 platform > readFromDestination: done. (writeToDestFailed:false, dest.isFinished:true)
2024-01-24 18:21:46 platform > thread status... timeout thread: false , replication thread: true
2024-01-24 18:21:46 platform > sync summary: {
  "status" : "completed",
  "recordsSynced" : 0,
  "bytesSynced" : 0,
  "startTime" : 1706120478137,
  "endTime" : 1706120506593,
  "totalStats" : {
    "bytesCommitted" : 16643326,
    "bytesEmitted" : 16643326,
    "destinationStateMessagesEmitted" : 0,
    "destinationWriteEndTime" : 1706120506574,
    "destinationWriteStartTime" : 1706120478138,
    "meanSecondsBeforeSourceStateMessageEmitted" : 0,
    "maxSecondsBeforeSourceStateMessageEmitted" : 0,
    "maxSecondsBetweenStateMessageEmittedandCommitted" : 0,
    "meanSecondsBetweenStateMessageEmittedandCommitted" : 0,
    "recordsEmitted" : 18335,
    "recordsCommitted" : 18335,
    "replicationEndTime" : 1706120506592,
    "replicationStartTime" : 1706120478137,
    "sourceReadEndTime" : 1706120504589,
    "sourceReadStartTime" : 1706120478139,
    "sourceStateMessagesEmitted" : 0
  },
  "streamStats" : [ {
    "streamName" : "COURSE",
    "streamNamespace" : "CDM_LMS",
    "stats" : {
      "bytesCommitted" : 16643326,
      "bytesEmitted" : 16643326,
      "recordsEmitted" : 18335,
      "recordsCommitted" : 18335
    }
  } ],
  "performanceMetrics" : {
    "processFromSource" : {
      "elapsedTimeInNanos" : 1232888986,
      "executionCount" : 18338,
      "avgExecTimeInNanos" : 67231.37670411168
    },
    "readFromSource" : {
      "elapsedTimeInNanos" : 19959283157,
      "executionCount" : 18706,
      "avgExecTimeInNanos" : 1066998.9926761468
    },
    "processFromDest" : {
      "elapsedTimeInNanos" : 0,
      "executionCount" : 0,
      "avgExecTimeInNanos" : "NaN"
    },
    "writeToDest" : {
      "elapsedTimeInNanos" : 8245955297,
      "executionCount" : 18335,
      "avgExecTimeInNanos" : 449738.49451868015
    },
    "readFromDest" : {
      "elapsedTimeInNanos" : 28310854251,
      "executionCount" : 173,
      "avgExecTimeInNanos" : 1.6364655636416185E8
    }
  }
}
2024-01-24 18:21:46 platform > failures: [ ]
2024-01-24 18:21:46 platform > 
2024-01-24 18:21:46 platform > ----- END REPLICATION -----
2024-01-24 18:21:46 platform > 
2024-01-24 18:21:46 platform > Docker volume job log path: /tmp/workspace/384/9/logs.log
2024-01-24 18:21:46 platform > Executing worker wrapper. Airbyte version: 0.50.44
2024-01-24 18:21:46 platform > Attempt 0 to save workflow id for cancellation
2024-01-24 18:21:46 platform > 
2024-01-24 18:21:46 platform > ----- START DEFAULT NORMALIZATION -----
2024-01-24 18:21:46 platform > 
2024-01-24 18:21:46 platform > Running with normalization version: airbyte/normalization-mssql:0.4.3
2024-01-24 18:21:46 platform > Checking if airbyte/normalization-mssql:0.4.3 exists...
2024-01-24 18:21:46 platform > airbyte/normalization-mssql:0.4.3 was found locally.
2024-01-24 18:21:46 platform > Creating docker container = normalization-mssql-normalize-384-9-doqcw with resources io.airbyte.config.ResourceRequirements@5cfe1e3b[cpuRequest=,cpuLimit=,memoryRequest=,memoryLimit=,additionalProperties={}] and allowedHosts null
2024-01-24 18:21:46 platform > Preparing command: docker run --rm --init -i -w /data/384/9/normalize --log-driver none --name normalization-mssql-normalize-384-9-doqcw --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e DEPLOYMENT_MODE=OSS -e AIRBYTE_ROLE=dev -e WORKER_ENVIRONMENT=DOCKER -e AIRBYTE_VERSION=0.50.44 airbyte/normalization-mssql:0.4.3 run --integration-type mssql --config destination_config.json --catalog destination_catalog.json
2024-01-24 18:21:47 normalization > Running: transform-config --config destination_config.json --integration-type mssql --out /data/384/9/normalize
2024-01-24 18:21:48 normalization > Namespace(config='destination_config.json', integration_type=<DestinationType.MSSQL: 'mssql'>, out='/data/384/9/normalize')
2024-01-24 18:21:48 normalization > transform_mssql
2024-01-24 18:21:48 normalization > Running: transform-catalog --integration-type mssql --profile-config-dir /data/384/9/normalize --catalog destination_catalog.json --out /data/384/9/normalize/models/generated/ --json-column _airbyte_data
2024-01-24 18:21:49 normalization > Processing destination_catalog.json...
2024-01-24 18:21:49 normalization >   Generating airbyte_ctes/dbo/course_ab1.sql from COURSE
2024-01-24 18:21:49 normalization >   Generating airbyte_ctes/dbo/course_ab2.sql from COURSE
2024-01-24 18:21:49 normalization >   Generating airbyte_ctes/dbo/course_ab3.sql from COURSE
2024-01-24 18:21:49 normalization >   Adding drop table hook for course_scd to course
2024-01-24 18:21:49 normalization >   Generating airbyte_tables/dbo/course.sql from COURSE
2024-01-24 18:21:49 normalization > detected no config file for ssh, assuming ssh is off.
2024-01-24 18:21:55 normalization >            [--event-buffer-size EVENT_BUFFER_SIZE]
2024-01-24 18:21:55 normalization >   --event-buffer-size EVENT_BUFFER_SIZE
2024-01-24 18:21:55 platform > 
2024-01-24 18:21:55 normalization > DBT >=1.0.0 detected; using 10K event buffer size
2024-01-24 18:21:55 platform > 
2024-01-24 18:22:00 normalization > Running with dbt=1.0.0
2024-01-24 18:22:00 normalization > Partial parse save file not found. Starting full parse.
2024-01-24 18:22:04 normalization > [WARNING]: Configuration paths exist in your dbt_project.yml file which do not apply to any resources.
There are 2 unused configuration paths:
- models.airbyte_utils.generated.airbyte_views
- models.airbyte_utils.generated.airbyte_incremental

2024-01-24 18:22:04 normalization > Found 4 models, 0 tests, 0 snapshots, 0 analyses, 621 macros, 0 operations, 0 seed files, 1 source, 0 exposures, 0 metrics
2024-01-24 18:22:04 normalization > Encountered an error:
Database Error
  ('28000', "[28000] [Microsoft][ODBC Driver 17 for SQL Server][SQL Server]Login failed for user 'SQLReporter'. (18456) (SQLDriverConnect)")
2024-01-24 18:22:04 platform > 
2024-01-24 18:22:04 normalization > Diagnosing dbt debug to check if destination is available for dbt and well configured (1):
2024-01-24 18:22:04 platform > 
2024-01-24 18:22:10 normalization > 18:22:10  Running with dbt=1.0.0
2024-01-24 18:22:10 normalization > dbt version: 1.0.0
2024-01-24 18:22:10 normalization > python version: 3.9.9
2024-01-24 18:22:10 normalization > python path: /usr/local/bin/python
2024-01-24 18:22:10 normalization > os info: Linux-5.15.0-91-generic-x86_64-with-glibc2.31
2024-01-24 18:22:10 normalization > Using profiles.yml file at /data/384/9/normalize/profiles.yml
2024-01-24 18:22:10 normalization > Using dbt_project.yml file at /data/384/9/normalize/dbt_project.yml
2024-01-24 18:22:10 platform > 
2024-01-24 18:22:11 normalization > Configuration:
2024-01-24 18:22:11 normalization >   profiles.yml file [OK found and valid]
2024-01-24 18:22:11 normalization >   dbt_project.yml file [OK found and valid]
2024-01-24 18:22:11 platform > 
2024-01-24 18:22:11 normalization > Required dependencies:
2024-01-24 18:22:11 normalization >  - git [OK found]
2024-01-24 18:22:11 platform > 
2024-01-24 18:22:11 normalization > Connection:
2024-01-24 18:22:11 normalization >   server: 192.168.0.25
2024-01-24 18:22:11 normalization >   database: DataHub
2024-01-24 18:22:11 normalization >   schema: dbo
2024-01-24 18:22:11 normalization >   port: 1433
2024-01-24 18:22:11 normalization >   UID: SQLReporter
2024-01-24 18:22:11 normalization >   client_id: None
2024-01-24 18:22:11 normalization >   authentication: sql
2024-01-24 18:22:11 normalization >   encrypt: False
2024-01-24 18:22:11 normalization >   trust_cert: False
2024-01-24 18:22:11 normalization >   Connection test: [ERROR]
2024-01-24 18:22:11 platform > 
2024-01-24 18:22:11 platform > 1
2024-01-24 18:22:11 normalization > dbt was unable to connect to the specified database.
2024-01-24 18:22:11 normalization > The database returned the following error:
2024-01-24 18:22:11 platform > 
2024-01-24 18:22:11 normalization >   >Database Error
2024-01-24 18:22:11 normalization >   ('28000', "[28000] [Microsoft][ODBC Driver 17 for SQL Server][SQL Server]Login failed for user 'SQLReporter'. (18456) (SQLDriverConnect)")
2024-01-24 18:22:11 platform > 
2024-01-24 18:22:11 normalization > Check your database credentials and try again. For more information, visit:
2024-01-24 18:22:11 normalization > https://docs.getdbt.com/docs/configure-your-profile
2024-01-24 18:22:11 platform > 
2024-01-24 18:22:11 platform > 
2024-01-24 18:22:12 platform > Terminating normalization process...
2024-01-24 18:22:12 platform > Normalization failed for job 384.
io.airbyte.workers.exception.WorkerException: Normalization process did not terminate normally (exit code: 1)
    at io.airbyte.workers.normalization.DefaultNormalizationRunner.close(DefaultNormalizationRunner.java:186) ~[io.airbyte-airbyte-commons-worker-0.50.44.jar:?]
    at io.airbyte.workers.general.DefaultNormalizationWorker.run(DefaultNormalizationWorker.java:89) ~[io.airbyte-airbyte-commons-worker-0.50.44.jar:?]
    at io.airbyte.workers.general.DefaultNormalizationWorker.run(DefaultNormalizationWorker.java:38) ~[io.airbyte-airbyte-commons-worker-0.50.44.jar:?]
    at io.airbyte.workers.temporal.TemporalAttemptExecution.get(TemporalAttemptExecution.java:142) ~[io.airbyte-airbyte-workers-0.50.44.jar:?]
    at io.airbyte.workers.temporal.sync.NormalizationActivityImpl.lambda$normalize$2(NormalizationActivityImpl.java:194) ~[io.airbyte-airbyte-workers-0.50.44.jar:?]
    at io.airbyte.commons.temporal.HeartbeatUtils.withBackgroundHeartbeat(HeartbeatUtils.java:57) ~[io.airbyte-airbyte-commons-temporal-core-0.50.44.jar:?]
    at io.airbyte.workers.temporal.sync.NormalizationActivityImpl.normalize(NormalizationActivityImpl.java:144) ~[io.airbyte-airbyte-workers-0.50.44.jar:?]
    at java.base/jdk.internal.reflect.DirectMethodHandleAccessor.invoke(DirectMethodHandleAccessor.java:103) ~[?:?]
    at java.base/java.lang.reflect.Method.invoke(Method.java:580) ~[?:?]
    at io.temporal.internal.activity.RootActivityInboundCallsInterceptor$POJOActivityInboundCallsInterceptor.executeActivity(RootActivityInboundCallsInterceptor.java:64) ~[temporal-sdk-1.22.3.jar:?]
    at io.temporal.internal.activity.RootActivityInboundCallsInterceptor.execute(RootActivityInboundCallsInterceptor.java:43) ~[temporal-sdk-1.22.3.jar:?]
    at io.temporal.internal.activity.ActivityTaskExecutors$BaseActivityTaskExecutor.execute(ActivityTaskExecutors.java:107) ~[temporal-sdk-1.22.3.jar:?]
    at io.temporal.internal.activity.ActivityTaskHandlerImpl.handle(ActivityTaskHandlerImpl.java:124) ~[temporal-sdk-1.22.3.jar:?]
    at io.temporal.internal.worker.ActivityWorker$TaskHandlerImpl.handleActivity(ActivityWorker.java:278) ~[temporal-sdk-1.22.3.jar:?]
    at io.temporal.internal.worker.ActivityWorker$TaskHandlerImpl.handle(ActivityWorker.java:243) ~[temporal-sdk-1.22.3.jar:?]
    at io.temporal.internal.worker.ActivityWorker$TaskHandlerImpl.handle(ActivityWorker.java:216) ~[temporal-sdk-1.22.3.jar:?]
    at io.temporal.internal.worker.PollTaskExecutor.lambda$process$0(PollTaskExecutor.java:105) ~[temporal-sdk-1.22.3.jar:?]
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) ~[?:?]
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ~[?:?]
    at java.base/java.lang.Thread.run(Thread.java:1583) [?:?]
2024-01-24 18:22:12 platform > Normalization executed in 25 seconds for job 384.
2024-01-24 18:22:12 platform > Normalization summary: io.airbyte.config.NormalizationSummary@59ec1db4[startTime=1706120506766,endTime=1706120532116,failures=[io.airbyte.config.FailureReason@2a471614[failureOrigin=normalization,failureType=system_error,internalMessage=('28000', "[28000] [Microsoft][ODBC Driver 17 for SQL Server][SQL Server]Login failed for user 'SQLReporter'. (18456) (SQLDriverConnect)"),externalMessage=Normalization failed during the dbt run. This may indicate a problem with the data itself.,metadata=io.airbyte.config.Metadata@663f2c28[additionalProperties={attemptNumber=9, jobId=384, from_trace_message=true}],stacktrace=AirbyteDbtError: 
Encountered an error:
Database Error
  ('28000', "[28000] [Microsoft][ODBC Driver 17 for SQL Server][SQL Server]Login failed for user 'SQLReporter'. (18456) (SQLDriverConnect)"),retryable=<null>,timestamp=1706120532111,additionalProperties={}], io.airbyte.config.FailureReason@6dace2c5[failureOrigin=normalization,failureType=system_error,internalMessage=('28000', "[28000] [Microsoft][ODBC Driver 17 for SQL Server][SQL Server]Login failed for user 'SQLReporter'. (18456) (SQLDriverConnect)"),externalMessage=Normalization failed during the dbt run. This may indicate a problem with the data itself.,metadata=io.airbyte.config.Metadata@54945379[additionalProperties={attemptNumber=9, jobId=384, from_trace_message=true}],stacktrace=AirbyteDbtError: 
Encountered an error:
Database Error
  ('28000', "[28000] [Microsoft][ODBC Driver 17 for SQL Server][SQL Server]Login failed for user 'SQLReporter'. (18456) (SQLDriverConnect)"),retryable=<null>,timestamp=1706120532111,additionalProperties={}]],additionalProperties={}]
2024-01-24 18:22:12 platform > 
2024-01-24 18:22:12 platform > ----- END DEFAULT NORMALIZATION -----
2024-01-24 18:22:12 platform > 
2024-01-24 18:22:12 platform > Retry State: RetryManager(completeFailureBackoffPolicy=BackoffPolicy(minInterval=PT10S, maxInterval=PT30M, base=3), partialFailureBackoffPolicy=null, successiveCompleteFailureLimit=5, totalCompleteFailureLimit=10, successivePartialFailureLimit=1000, totalPartialFailureLimit=10, successiveCompleteFailures=0, totalCompleteFailures=0, successivePartialFailures=10, totalPartialFailures=10)
 Backoff before next attempt: 0 seconds
2024-01-24 18:22:12 platform > Failing job: 384, reason: Job failed after too many retries for connection 4a254d17-6a84-42e2-ae62-cae0de8bec54
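
As a diagnostic aside (not part of the original report): a quick way to confirm the credentials themselves are fine, and that the failure is in how normalization hands them to dbt, is to connect directly with pyodbc using the same settings that dbt debug printed above. This sketch assumes pyodbc and the Microsoft ODBC Driver 17 are installed; the bracing of the password follows the escaping rule sketched earlier.

```python
# Standalone credential check, assuming pyodbc and ODBC Driver 17 are installed.
import pyodbc

conn_str = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=192.168.0.25,1433;"
    "DATABASE=DataHub;"
    "UID=SQLReporter;"
    "PWD={5!x0#NGKVKm}}rc[^,5Su}"  # braced, with the literal '}' doubled
)

# Raises pyodbc.Error (28000 / 18456) if the login genuinely fails.
with pyodbc.connect(conn_str, timeout=5) as conn:
    row = conn.cursor().execute("SELECT 1").fetchone()
    print("login ok:", row[0] == 1)
```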

marcosmarxm commented 9 months ago

Hello @ibrentlam, thanks for reporting the issue. At the moment the destination connector team is working to release MSSQL Destination V2, which removes the normalization module from the connector. I'm going to add this to the team backlog for future reference when they make the changes.