airbytehq / airbyte

The leading data integration platform for ETL / ELT data pipelines from APIs, databases & files to data warehouses, data lakes & data lakehouses. Both self-hosted and Cloud-hosted.
https://airbyte.com

[platform] You must upgrade your platform version to use this connector version (latest version already) #39900

Closed: erwamartin closed this issue 2 months ago

erwamartin commented 3 months ago

Connector Name

destination-bigquery

Connector Version

2.7.1

What step the error happened?

During the sync

Relevant information

I'm getting the following error while syncing Intercom to BigQuery:

Failure in destination: You must upgrade your platform version to use this connector version. Either downgrade your connector or upgrade platform to 0.63.0

Airbyte version is 0.63.1.

Relevant log output

>> ATTEMPT 1/1

2024-06-20 14:46:07 platform > Docker volume job log path: /tmp/workspace/1390/0/logs.log
2024-06-20 14:46:07 platform > Executing worker wrapper. Airbyte version: 0.63.1
2024-06-20 14:46:07 platform > 
2024-06-20 14:46:07 platform > Using default value for environment variable SIDECAR_KUBE_CPU_LIMIT: '2.0'
2024-06-20 14:46:07 platform > ----- START CHECK -----
2024-06-20 14:46:07 platform > Using default value for environment variable SOCAT_KUBE_CPU_LIMIT: '2.0'
2024-06-20 14:46:07 platform > 
2024-06-20 14:46:07 platform > Using default value for environment variable SIDECAR_KUBE_CPU_REQUEST: '0.1'
2024-06-20 14:46:07 platform > Using default value for environment variable SOCAT_KUBE_CPU_REQUEST: '0.1'
2024-06-20 14:46:07 platform > Checking if airbyte/source-intercom:0.6.7 exists...
2024-06-20 14:46:07 platform > airbyte/source-intercom:0.6.7 was found locally.
2024-06-20 14:46:07 platform > Creating docker container = source-intercom-check-1390-0-nnflg with resources io.airbyte.config.ResourceRequirements@4a13305e[cpuRequest=,cpuLimit=,memoryRequest=,memoryLimit=,additionalProperties={}] and allowedHosts io.airbyte.config.AllowedHosts@3aab543a[hosts=[api.intercom.io, *.datadoghq.com, *.datadoghq.eu, *.sentry.io],additionalProperties={}]
2024-06-20 14:46:07 platform > Preparing command: docker run --rm --init -i -w /data/1390/0 --log-driver none --name source-intercom-check-1390-0-nnflg --network host -v airbyte_workspace:/data -v oss_local_root:/local -e DEPLOYMENT_MODE=OSS -e WORKER_CONNECTOR_IMAGE=airbyte/source-intercom:0.6.7 -e AUTO_DETECT_SCHEMA=true -e LAUNCHDARKLY_KEY= -e SOCAT_KUBE_CPU_REQUEST=0.1 -e SOCAT_KUBE_CPU_LIMIT=2.0 -e FIELD_SELECTION_WORKSPACES= -e USE_STREAM_CAPABLE_STATE=true -e AIRBYTE_ROLE=dev -e WORKER_ENVIRONMENT=DOCKER -e APPLY_FIELD_SELECTION=false -e WORKER_JOB_ATTEMPT=0 -e OTEL_COLLECTOR_ENDPOINT=http://host.docker.internal:4317 -e FEATURE_FLAG_CLIENT=config -e AIRBYTE_VERSION=0.63.1 -e WORKER_JOB_ID=1390 airbyte/source-intercom:0.6.7 check --config source_config.json
2024-06-20 14:46:07 platform > Reading messages from protocol version 0.2.0
2024-06-20 14:46:10 platform > Check succeeded
2024-06-20 14:46:10 platform > Check connection job received output: io.airbyte.config.StandardCheckConnectionOutput@f896fa6[status=succeeded,message=<null>,additionalProperties={}]
2024-06-20 14:46:10 platform > 
2024-06-20 14:46:10 platform > ----- END CHECK -----
2024-06-20 14:46:10 platform > 
2024-06-20 14:46:11 platform > Docker volume job log path: /tmp/workspace/1390/0/logs.log
2024-06-20 14:46:11 platform > Executing worker wrapper. Airbyte version: 0.63.1
2024-06-20 14:46:11 platform > 
2024-06-20 14:46:11 platform > Using default value for environment variable SIDECAR_KUBE_CPU_LIMIT: '2.0'
2024-06-20 14:46:11 platform > ----- START CHECK -----
2024-06-20 14:46:11 platform > Using default value for environment variable SOCAT_KUBE_CPU_LIMIT: '2.0'
2024-06-20 14:46:11 platform > 
2024-06-20 14:46:11 platform > Using default value for environment variable SIDECAR_KUBE_CPU_REQUEST: '0.1'
2024-06-20 14:46:11 platform > Using default value for environment variable SOCAT_KUBE_CPU_REQUEST: '0.1'
2024-06-20 14:46:11 platform > Checking if airbyte/destination-bigquery:2.7.1 exists...
2024-06-20 14:46:11 platform > airbyte/destination-bigquery:2.7.1 was found locally.
2024-06-20 14:46:11 platform > Creating docker container = destination-bigquery-check-1390-0-zwfjc with resources io.airbyte.config.ResourceRequirements@407584a1[cpuRequest=,cpuLimit=,memoryRequest=,memoryLimit=,additionalProperties={}] and allowedHosts null
2024-06-20 14:46:11 platform > Preparing command: docker run --rm --init -i -w /data/1390/0 --log-driver none --name destination-bigquery-check-1390-0-zwfjc --network host -v airbyte_workspace:/data -v oss_local_root:/local -e DEPLOYMENT_MODE=OSS -e WORKER_CONNECTOR_IMAGE=airbyte/destination-bigquery:2.7.1 -e AUTO_DETECT_SCHEMA=true -e LAUNCHDARKLY_KEY= -e SOCAT_KUBE_CPU_REQUEST=0.1 -e SOCAT_KUBE_CPU_LIMIT=2.0 -e FIELD_SELECTION_WORKSPACES= -e USE_STREAM_CAPABLE_STATE=true -e AIRBYTE_ROLE=dev -e WORKER_ENVIRONMENT=DOCKER -e APPLY_FIELD_SELECTION=false -e WORKER_JOB_ATTEMPT=0 -e OTEL_COLLECTOR_ENDPOINT=http://host.docker.internal:4317 -e FEATURE_FLAG_CLIENT=config -e AIRBYTE_VERSION=0.63.1 -e WORKER_JOB_ID=1390 airbyte/destination-bigquery:2.7.1 check --config source_config.json
2024-06-20 14:46:11 platform > Reading messages from protocol version 0.2.0
2024-06-20 14:46:13 platform > INFO main i.a.i.d.b.BigQueryDestinationKt(main):388 Starting Destination : class io.airbyte.integrations.destination.bigquery.BigQueryDestination
2024-06-20 14:46:13 platform > INFO main i.a.c.i.b.IntegrationCliParser$Companion(parseOptions):144 integration args: {check=null, config=source_config.json}
2024-06-20 14:46:13 platform > INFO main i.a.c.i.b.IntegrationRunner(runInternal):124 Running integration: io.airbyte.integrations.destination.bigquery.BigQueryDestination
2024-06-20 14:46:13 platform > INFO main i.a.c.i.b.IntegrationRunner(runInternal):125 Command: CHECK
2024-06-20 14:46:13 platform > INFO main i.a.c.i.b.IntegrationRunner(runInternal):126 Integration config: IntegrationConfig{command=CHECK, configPath='source_config.json', catalogPath='null', statePath='null'}
2024-06-20 14:46:13 platform > WARN main c.n.s.JsonMetaSchema(newValidator):278 Unknown keyword groups - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2024-06-20 14:46:13 platform > WARN main c.n.s.JsonMetaSchema(newValidator):278 Unknown keyword group - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2024-06-20 14:46:13 platform > WARN main c.n.s.JsonMetaSchema(newValidator):278 Unknown keyword order - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2024-06-20 14:46:13 platform > WARN main c.n.s.JsonMetaSchema(newValidator):278 Unknown keyword display_type - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2024-06-20 14:46:13 platform > WARN main c.n.s.JsonMetaSchema(newValidator):278 Unknown keyword airbyte_secret - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2024-06-20 14:46:13 platform > WARN main c.n.s.JsonMetaSchema(newValidator):278 Unknown keyword always_show - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2024-06-20 14:46:14 platform > INFO main i.a.i.d.b.BigQueryUtils(getLoadingMethod):294 Selected loading method is set to: GCS
2024-06-20 14:46:17 platform > INFO main i.a.c.i.d.s.UploadFormatConfigFactory(getUploadFormatConfig):20 File upload format config: {"format_type":"CSV","flattening":"No flattening"}
2024-06-20 14:46:17 platform > INFO main i.a.c.i.d.s.S3BaseChecks(testSingleUpload):38 Started testing if all required credentials assigned to user for single file uploading
2024-06-20 14:46:18 platform > INFO main i.a.c.i.d.s.S3BaseChecks(testSingleUpload):48 Finished checking for normal upload mode
2024-06-20 14:46:18 platform > INFO main i.a.c.i.d.s.S3BaseChecks(testMultipartUpload):54 Started testing if all required credentials assigned to user for multipart upload
2024-06-20 14:46:18 platform > INFO main a.m.s.StreamTransferManager(getMultiPartOutputStreams):329 Initiated multipart upload to airbyte_intercom_bigquery/data/test_1718894778217 with full ID ABPnzm64zcNAwTAHtEhObAYL594uFIpkvtoZSKWCYFwRDZxrhfwWj9kIFYvdq1C8RsTxKQs
2024-06-20 14:46:18 platform > INFO main a.m.s.MultiPartOutputStream(close):158 Called close() on [MultipartOutputStream for parts 1 - 10000]
2024-06-20 14:46:18 platform > INFO main a.m.s.MultiPartOutputStream(close):158 Called close() on [MultipartOutputStream for parts 1 - 10000]
2024-06-20 14:46:18 platform > WARN main a.m.s.MultiPartOutputStream(close):160 [MultipartOutputStream for parts 1 - 10000] is already closed
2024-06-20 14:46:18 platform > INFO main a.m.s.StreamTransferManager(complete):367 [Manager uploading to airbyte_intercom_bigquery/data/test_1718894778217 with id ABPnzm64z...C8RsTxKQs]: Uploading leftover stream [Part number 1 containing 3.34 MB]
2024-06-20 14:46:18 platform > INFO main a.m.s.StreamTransferManager(uploadStreamPart):560 [Manager uploading to airbyte_intercom_bigquery/data/test_1718894778217 with id ABPnzm64z...C8RsTxKQs]: Finished uploading [Part number 1 containing 3.34 MB]
2024-06-20 14:46:18 platform > INFO main a.m.s.StreamTransferManager(complete):397 [Manager uploading to airbyte_intercom_bigquery/data/test_1718894778217 with id ABPnzm64z...C8RsTxKQs]: Completed
2024-06-20 14:46:18 platform > INFO main i.a.c.i.d.s.S3BaseChecks(testMultipartUpload):86 Finished verification for multipart upload mode
2024-06-20 14:46:20 platform > INFO main i.a.c.i.b.IntegrationRunner(runInternal):268 Completed integration: io.airbyte.integrations.destination.bigquery.BigQueryDestination
2024-06-20 14:46:20 platform > INFO main i.a.i.d.b.BigQueryDestinationKt(main):390 Completed Destination : class io.airbyte.integrations.destination.bigquery.BigQueryDestination
2024-06-20 14:46:20 platform > Check connection job received output: io.airbyte.config.StandardCheckConnectionOutput@209cdac0[status=succeeded,message=<null>,additionalProperties={}]
2024-06-20 14:46:20 platform > 
2024-06-20 14:46:20 platform > ----- END CHECK -----
2024-06-20 14:46:20 platform > 
2024-06-20 14:46:20 platform > Docker volume job log path: /tmp/workspace/1390/0/logs.log
2024-06-20 14:46:20 platform > Executing worker wrapper. Airbyte version: 0.63.1
2024-06-20 14:46:20 platform > start sync worker. job id: 1390 attempt id: 0
2024-06-20 14:46:20 platform > 
2024-06-20 14:46:20 platform > ----- START REPLICATION -----
2024-06-20 14:46:20 platform > 
2024-06-20 14:46:21 platform > Using default value for environment variable SIDECAR_KUBE_CPU_LIMIT: '2.0'
2024-06-20 14:46:21 platform > Using default value for environment variable SOCAT_KUBE_CPU_LIMIT: '2.0'
2024-06-20 14:46:21 platform > Using default value for environment variable SIDECAR_KUBE_CPU_REQUEST: '0.1'
2024-06-20 14:46:21 platform > Using default value for environment variable SOCAT_KUBE_CPU_REQUEST: '0.1'
2024-06-20 14:46:21 platform > Running destination...
2024-06-20 14:46:21 platform > Using default value for environment variable SIDECAR_KUBE_CPU_LIMIT: '2.0'
2024-06-20 14:46:21 platform > Checking if airbyte/source-intercom:0.6.7 exists...
2024-06-20 14:46:21 platform > Using default value for environment variable SOCAT_KUBE_CPU_LIMIT: '2.0'
2024-06-20 14:46:21 platform > Using default value for environment variable SIDECAR_KUBE_CPU_REQUEST: '0.1'
2024-06-20 14:46:21 platform > Using default value for environment variable SOCAT_KUBE_CPU_REQUEST: '0.1'
2024-06-20 14:46:21 platform > Checking if airbyte/destination-bigquery:2.7.1 exists...
2024-06-20 14:46:21 platform > airbyte/source-intercom:0.6.7 was found locally.
2024-06-20 14:46:21 platform > Creating docker container = source-intercom-read-1390-0-icqka with resources io.airbyte.config.ResourceRequirements@5779d94[cpuRequest=0.2,cpuLimit=1,memoryRequest=1Gi,memoryLimit=2Gi,additionalProperties={}] and allowedHosts io.airbyte.config.AllowedHosts@2ce1b0[hosts=[api.intercom.io, *.datadoghq.com, *.datadoghq.eu, *.sentry.io],additionalProperties={}]
2024-06-20 14:46:21 platform > Preparing command: docker run --rm --init -i -w /data/1390/0 --log-driver none --name source-intercom-read-1390-0-icqka -e CONCURRENT_SOURCE_STREAM_READ=false --network host -v airbyte_workspace:/data -v oss_local_root:/local -e DEPLOYMENT_MODE=OSS -e WORKER_CONNECTOR_IMAGE=airbyte/source-intercom:0.6.7 -e AUTO_DETECT_SCHEMA=true -e LAUNCHDARKLY_KEY= -e SOCAT_KUBE_CPU_REQUEST=0.1 -e SOCAT_KUBE_CPU_LIMIT=2.0 -e FIELD_SELECTION_WORKSPACES= -e USE_STREAM_CAPABLE_STATE=true -e AIRBYTE_ROLE=dev -e WORKER_ENVIRONMENT=DOCKER -e APPLY_FIELD_SELECTION=false -e WORKER_JOB_ATTEMPT=0 -e OTEL_COLLECTOR_ENDPOINT=http://host.docker.internal:4317 -e FEATURE_FLAG_CLIENT=config -e AIRBYTE_VERSION=0.63.1 -e WORKER_JOB_ID=1390 --cpus=1 --memory-reservation=1Gi --memory=2Gi airbyte/source-intercom:0.6.7 read --config source_config.json --catalog source_catalog.json
2024-06-20 14:46:21 platform > Reading messages from protocol version 0.2.0
2024-06-20 14:46:21 platform > airbyte/destination-bigquery:2.7.1 was found locally.
2024-06-20 14:46:21 platform > Creating docker container = destination-bigquery-write-1390-0-qojva with resources io.airbyte.config.ResourceRequirements@63553178[cpuRequest=0.2,cpuLimit=1,memoryRequest=1Gi,memoryLimit=2Gi,additionalProperties={}] and allowedHosts null
2024-06-20 14:46:21 platform > Preparing command: docker run --rm --init -i -w /data/1390/0 --log-driver none --name destination-bigquery-write-1390-0-qojva --network host -v airbyte_workspace:/data -v oss_local_root:/local -e DEPLOYMENT_MODE=OSS -e WORKER_CONNECTOR_IMAGE=airbyte/destination-bigquery:2.7.1 -e AUTO_DETECT_SCHEMA=true -e LAUNCHDARKLY_KEY= -e SOCAT_KUBE_CPU_REQUEST=0.1 -e SOCAT_KUBE_CPU_LIMIT=2.0 -e FIELD_SELECTION_WORKSPACES= -e USE_STREAM_CAPABLE_STATE=true -e AIRBYTE_ROLE=dev -e WORKER_ENVIRONMENT=DOCKER -e APPLY_FIELD_SELECTION=false -e WORKER_JOB_ATTEMPT=0 -e OTEL_COLLECTOR_ENDPOINT=http://host.docker.internal:4317 -e FEATURE_FLAG_CLIENT=config -e AIRBYTE_VERSION=0.63.1 -e WORKER_JOB_ID=1390 --cpus=1 --memory-reservation=1Gi --memory=2Gi airbyte/destination-bigquery:2.7.1 write --config destination_config.json --catalog destination_catalog.json
2024-06-20 14:46:21 platform > Writing messages to protocol version 0.2.0
2024-06-20 14:46:21 platform > Reading messages from protocol version 0.2.0
2024-06-20 14:46:21 platform > readFromSource: start
2024-06-20 14:46:21 platform > Starting source heartbeat check. Will check threshold of 10800 seconds, every 1 minutes.
2024-06-20 14:46:21 platform > processMessage: start
2024-06-20 14:46:21 platform > writeToDestination: start
2024-06-20 14:46:21 platform > readFromDestination: start
2024-06-20 14:46:23 destination > INFO main i.a.i.d.b.BigQueryDestinationKt(main):388 Starting Destination : class io.airbyte.integrations.destination.bigquery.BigQueryDestination
2024-06-20 14:46:24 source > Starting syncing SourceIntercom
2024-06-20 14:46:24 source > Marking stream teams as STARTED
2024-06-20 14:46:24 source > Syncing stream: teams 
2024-06-20 14:46:24 platform > Stream status TRACE received of status: STARTED for stream teams
2024-06-20 14:46:24 destination > INFO main i.a.c.i.b.IntegrationCliParser$Companion(parseOptions):144 integration args: {catalog=destination_catalog.json, write=null, config=destination_config.json}
2024-06-20 14:46:24 platform > Sending update for teams - null -> RUNNING
2024-06-20 14:46:24 platform > Stream Status Update Received: teams - RUNNING
2024-06-20 14:46:24 platform > Creating status: teams - RUNNING
2024-06-20 14:46:24 destination > INFO main i.a.c.i.b.IntegrationRunner(runInternal):124 Running integration: io.airbyte.integrations.destination.bigquery.BigQueryDestination
2024-06-20 14:46:24 destination > INFO main i.a.c.i.b.IntegrationRunner(runInternal):125 Command: WRITE
2024-06-20 14:46:24 destination > INFO main i.a.c.i.b.IntegrationRunner(runInternal):126 Integration config: IntegrationConfig{command=WRITE, configPath='destination_config.json', catalogPath='destination_catalog.json', statePath='null'}
2024-06-20 14:46:24 source > Read 0 records from teams stream
2024-06-20 14:46:24 source > Marking stream teams as STOPPED
2024-06-20 14:46:24 source > Finished syncing teams
2024-06-20 14:46:24 source > SourceIntercom runtimes:
Syncing stream teams 0:00:00.271045
2024-06-20 14:46:24 source > Marking stream companies as STARTED
2024-06-20 14:46:24 source > Syncing stream: companies 
2024-06-20 14:46:24 platform > Source state message checksum is valid for stream _teams.
2024-06-20 14:46:24 platform > Stream status TRACE received of status: COMPLETE for stream teams
2024-06-20 14:46:24 platform > Stream status TRACE received of status: STARTED for stream companies
2024-06-20 14:46:24 platform > Sending update for companies - null -> RUNNING
2024-06-20 14:46:24 platform > Stream Status Update Received: companies - RUNNING
2024-06-20 14:46:24 platform > Creating status: companies - RUNNING
2024-06-20 14:46:24 destination > WARN main c.n.s.JsonMetaSchema(newValidator):278 Unknown keyword groups - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2024-06-20 14:46:24 destination > WARN main c.n.s.JsonMetaSchema(newValidator):278 Unknown keyword group - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2024-06-20 14:46:24 destination > WARN main c.n.s.JsonMetaSchema(newValidator):278 Unknown keyword order - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2024-06-20 14:46:24 destination > WARN main c.n.s.JsonMetaSchema(newValidator):278 Unknown keyword display_type - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2024-06-20 14:46:24 destination > WARN main c.n.s.JsonMetaSchema(newValidator):278 Unknown keyword airbyte_secret - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2024-06-20 14:46:24 destination > WARN main c.n.s.JsonMetaSchema(newValidator):278 Unknown keyword always_show - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2024-06-20 14:46:24 destination > INFO main i.a.i.d.b.BigQueryUtils(getLoadingMethod):294 Selected loading method is set to: GCS
2024-06-20 14:46:24 platform > Source state message checksum is valid for stream _companies.
2024-06-20 14:46:24 source > Read 0 records from companies stream
2024-06-20 14:46:24 source > Marking stream companies as STOPPED
2024-06-20 14:46:24 platform > Stream status TRACE received of status: COMPLETE for stream companies
2024-06-20 14:46:24 source > Finished syncing companies
2024-06-20 14:46:24 source > SourceIntercom runtimes:
Syncing stream companies 0:00:00.405448
Syncing stream teams 0:00:00.271045
2024-06-20 14:46:24 source > Marking stream contacts as STARTED
2024-06-20 14:46:24 platform > Stream status TRACE received of status: STARTED for stream contacts
2024-06-20 14:46:24 platform > Sending update for contacts - null -> RUNNING
2024-06-20 14:46:24 platform > Stream Status Update Received: contacts - RUNNING
2024-06-20 14:46:24 source > Syncing stream: contacts 
2024-06-20 14:46:24 platform > Creating status: contacts - RUNNING
2024-06-20 14:46:25 platform > readFromDestination: exception caught
io.airbyte.workers.internal.exception.DestinationException: Destination process exited with non-zero exit code 1
    at io.airbyte.workers.general.BufferedReplicationWorker.readFromDestination(BufferedReplicationWorker.java:493) ~[io.airbyte-airbyte-commons-worker-0.63.1.jar:?]
    at io.airbyte.workers.general.BufferedReplicationWorker.lambda$runAsync$2(BufferedReplicationWorker.java:235) ~[io.airbyte-airbyte-commons-worker-0.63.1.jar:?]
    at java.base/java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804) ~[?:?]
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) ~[?:?]
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ~[?:?]
    at java.base/java.lang.Thread.run(Thread.java:1583) [?:?]
2024-06-20 14:46:25 platform > readFromDestination: done. (writeToDestFailed:false, dest.isFinished:true)
2024-06-20 14:46:25 platform > writeToDestination: exception caught
java.io.IOException: Stream closed
    at java.base/java.lang.ProcessBuilder$NullOutputStream.write(ProcessBuilder.java:447) ~[?:?]
    at java.base/java.io.OutputStream.write(OutputStream.java:167) ~[?:?]
    at java.base/java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:125) ~[?:?]
    at java.base/java.io.BufferedOutputStream.implFlush(BufferedOutputStream.java:252) ~[?:?]
    at java.base/java.io.BufferedOutputStream.flush(BufferedOutputStream.java:246) ~[?:?]
    at java.base/sun.nio.cs.StreamEncoder.implFlush(StreamEncoder.java:412) ~[?:?]
    at java.base/sun.nio.cs.StreamEncoder.lockedFlush(StreamEncoder.java:214) ~[?:?]
    at java.base/sun.nio.cs.StreamEncoder.flush(StreamEncoder.java:201) ~[?:?]
    at java.base/java.io.OutputStreamWriter.flush(OutputStreamWriter.java:262) ~[?:?]
    at java.base/java.io.BufferedWriter.implFlush(BufferedWriter.java:372) ~[?:?]
    at java.base/java.io.BufferedWriter.flush(BufferedWriter.java:359) ~[?:?]
    at io.airbyte.workers.internal.DefaultAirbyteMessageBufferedWriter.flush(DefaultAirbyteMessageBufferedWriter.java:31) ~[io.airbyte-airbyte-commons-worker-0.63.1.jar:?]
    at io.airbyte.workers.internal.DefaultAirbyteDestination.notifyEndOfInputWithNoTimeoutMonitor(DefaultAirbyteDestination.java:155) ~[io.airbyte-airbyte-commons-worker-0.63.1.jar:?]
    at io.airbyte.workers.internal.DefaultAirbyteDestination.notifyEndOfInput(DefaultAirbyteDestination.java:145) ~[io.airbyte-airbyte-commons-worker-0.63.1.jar:?]
    at io.airbyte.workers.general.BufferedReplicationWorker.writeToDestination(BufferedReplicationWorker.java:459) ~[io.airbyte-airbyte-commons-worker-0.63.1.jar:?]
    at io.airbyte.workers.general.BufferedReplicationWorker.lambda$runAsyncWithTimeout$5(BufferedReplicationWorker.java:263) ~[io.airbyte-airbyte-commons-worker-0.63.1.jar:?]
    at java.base/java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804) ~[?:?]
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) ~[?:?]
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ~[?:?]
    at java.base/java.lang.Thread.run(Thread.java:1583) [?:?]
2024-06-20 14:46:25 platform > writeToDestination: done. (forDest.isDone:false, isDestRunning:false)
2024-06-20 14:46:25 platform > processMessage: done. (fromSource.isDone:false, forDest.isClosed:true)
2024-06-20 14:46:25 source > Marking stream contacts as RUNNING
2024-06-20 14:46:25 platform > readFromSource: exception caught
java.lang.IllegalStateException: Source process is still alive, cannot retrieve exit value.
    at com.google.common.base.Preconditions.checkState(Preconditions.java:515) ~[guava-33.1.0-jre.jar:?]
    at io.airbyte.workers.internal.DefaultAirbyteSource.getExitValue(DefaultAirbyteSource.java:136) ~[io.airbyte-airbyte-commons-worker-0.63.1.jar:?]
    at io.airbyte.workers.general.BufferedReplicationWorker.readFromSource(BufferedReplicationWorker.java:375) ~[io.airbyte-airbyte-commons-worker-0.63.1.jar:?]
    at io.airbyte.workers.general.BufferedReplicationWorker.lambda$runAsyncWithHeartbeatCheck$3(BufferedReplicationWorker.java:242) ~[io.airbyte-airbyte-commons-worker-0.63.1.jar:?]
    at java.base/java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804) ~[?:?]
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) ~[?:?]
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ~[?:?]
    at java.base/java.lang.Thread.run(Thread.java:1583) [?:?]
2024-06-20 14:46:25 platform > readFromSource: done. (source.isFinished:false, fromSource.isClosed:true)
2024-06-20 14:47:25 platform > airbyte-source gobbler IOException: Stream closed. Typically happens when cancelling a job.
2024-06-20 14:47:25 platform > sync summary: {
  "status" : "failed",
  "recordsSynced" : 0,
  "bytesSynced" : 0,
  "startTime" : 1718894780975,
  "endTime" : 1718894845783,
  "totalStats" : {
    "bytesCommitted" : 0,
    "bytesEmitted" : 0,
    "destinationStateMessagesEmitted" : 0,
    "destinationWriteEndTime" : 0,
    "destinationWriteStartTime" : 1718894780992,
    "meanSecondsBeforeSourceStateMessageEmitted" : 0,
    "maxSecondsBeforeSourceStateMessageEmitted" : 0,
    "maxSecondsBetweenStateMessageEmittedandCommitted" : 0,
    "meanSecondsBetweenStateMessageEmittedandCommitted" : 0,
    "recordsEmitted" : 0,
    "recordsCommitted" : 0,
    "replicationEndTime" : 1718894845780,
    "replicationStartTime" : 1718894780975,
    "sourceReadEndTime" : 0,
    "sourceReadStartTime" : 1718894780993,
    "sourceStateMessagesEmitted" : 2
  },
  "streamStats" : [ {
    "streamName" : "teams",
    "stats" : {
      "bytesCommitted" : 0,
      "bytesEmitted" : 0,
      "recordsEmitted" : 0,
      "recordsCommitted" : 0
    }
  }, {
    "streamName" : "companies",
    "stats" : {
      "bytesCommitted" : 0,
      "bytesEmitted" : 0,
      "recordsEmitted" : 0,
      "recordsCommitted" : 0
    }
  } ],
  "performanceMetrics" : {
    "processFromSource" : {
      "elapsedTimeInNanos" : 98247836,
      "executionCount" : 7,
      "avgExecTimeInNanos" : 1.4035405142857144E7
    },
    "readFromSource" : {
      "elapsedTimeInNanos" : 4683057987,
      "executionCount" : 8,
      "avgExecTimeInNanos" : 5.85382248375E8
    },
    "processFromDest" : {
      "elapsedTimeInNanos" : 768992,
      "executionCount" : 1,
      "avgExecTimeInNanos" : 768992.0
    },
    "writeToDest" : {
      "elapsedTimeInNanos" : 34232646,
      "executionCount" : 2,
      "avgExecTimeInNanos" : 1.7116323E7
    },
    "readFromDest" : {
      "elapsedTimeInNanos" : 4342195280,
      "executionCount" : 37,
      "avgExecTimeInNanos" : 1.173566291891892E8
    }
  }
}
2024-06-20 14:47:25 platform > failures: [ {
  "failureOrigin" : "destination",
  "failureType" : "config_error",
  "internalMessage" : "io.airbyte.commons.exceptions.ConfigErrorException: You must upgrade your platform version to use this connector version. Either downgrade your connector or upgrade platform to 0.63.0",
  "externalMessage" : "You must upgrade your platform version to use this connector version. Either downgrade your connector or upgrade platform to 0.63.0",
  "metadata" : {
    "attemptNumber" : 0,
    "jobId" : 1390,
    "from_trace_message" : true,
    "connector_command" : "write"
  },
  "stacktrace" : "io.airbyte.commons.exceptions.ConfigErrorException: You must upgrade your platform version to use this connector version. Either downgrade your connector or upgrade platform to 0.63.0\n\tat io.airbyte.integrations.base.destination.typing_deduping.CatalogParser.toStreamConfig(CatalogParser.kt:139)\n\tat io.airbyte.integrations.base.destination.typing_deduping.CatalogParser.parseCatalog(CatalogParser.kt:56)\n\tat io.airbyte.integrations.destination.bigquery.BigQueryDestination.parseCatalog(BigQueryDestination.kt:324)\n\tat io.airbyte.integrations.destination.bigquery.BigQueryDestination.getSerializedMessageConsumer(BigQueryDestination.kt:221)\n\tat io.airbyte.cdk.integrations.base.IntegrationRunner.runInternal(IntegrationRunner.kt:208)\n\tat io.airbyte.cdk.integrations.base.IntegrationRunner.run(IntegrationRunner.kt:116)\n\tat io.airbyte.integrations.destination.bigquery.BigQueryDestinationKt.main(BigQueryDestination.kt:389)\n",
  "timestamp" : 1718894785266
}, {
  "failureOrigin" : "destination",
  "internalMessage" : "Destination process exited with non-zero exit code 1",
  "externalMessage" : "Something went wrong within the destination connector",
  "metadata" : {
    "attemptNumber" : 0,
    "jobId" : 1390,
    "connector_command" : "write"
  },
  "stacktrace" : "io.airbyte.workers.internal.exception.DestinationException: Destination process exited with non-zero exit code 1\n\tat io.airbyte.workers.general.BufferedReplicationWorker.readFromDestination(BufferedReplicationWorker.java:493)\n\tat io.airbyte.workers.general.BufferedReplicationWorker.lambda$runAsync$2(BufferedReplicationWorker.java:235)\n\tat java.base/java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)\n\tat java.base/java.lang.Thread.run(Thread.java:1583)\n",
  "timestamp" : 1718894785432
}, {
  "failureOrigin" : "destination",
  "internalMessage" : "Destination process message delivery failed",
  "externalMessage" : "Something went wrong within the destination connector",
  "metadata" : {
    "attemptNumber" : 0,
    "jobId" : 1390,
    "connector_command" : "write"
  },
  "stacktrace" : "io.airbyte.workers.internal.exception.DestinationException: Destination process message delivery failed\n\tat io.airbyte.workers.general.BufferedReplicationWorker.writeToDestination(BufferedReplicationWorker.java:465)\n\tat io.airbyte.workers.general.BufferedReplicationWorker.lambda$runAsyncWithTimeout$5(BufferedReplicationWorker.java:263)\n\tat java.base/java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)\n\tat java.base/java.lang.Thread.run(Thread.java:1583)\nCaused by: java.io.IOException: Stream closed\n\tat java.base/java.lang.ProcessBuilder$NullOutputStream.write(ProcessBuilder.java:447)\n\tat java.base/java.io.OutputStream.write(OutputStream.java:167)\n\tat java.base/java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:125)\n\tat java.base/java.io.BufferedOutputStream.implFlush(BufferedOutputStream.java:252)\n\tat java.base/java.io.BufferedOutputStream.flush(BufferedOutputStream.java:246)\n\tat java.base/sun.nio.cs.StreamEncoder.implFlush(StreamEncoder.java:412)\n\tat java.base/sun.nio.cs.StreamEncoder.lockedFlush(StreamEncoder.java:214)\n\tat java.base/sun.nio.cs.StreamEncoder.flush(StreamEncoder.java:201)\n\tat java.base/java.io.OutputStreamWriter.flush(OutputStreamWriter.java:262)\n\tat java.base/java.io.BufferedWriter.implFlush(BufferedWriter.java:372)\n\tat java.base/java.io.BufferedWriter.flush(BufferedWriter.java:359)\n\tat io.airbyte.workers.internal.DefaultAirbyteMessageBufferedWriter.flush(DefaultAirbyteMessageBufferedWriter.java:31)\n\tat io.airbyte.workers.internal.DefaultAirbyteDestination.notifyEndOfInputWithNoTimeoutMonitor(DefaultAirbyteDestination.java:155)\n\tat io.airbyte.workers.internal.DefaultAirbyteDestination.notifyEndOfInput(DefaultAirbyteDestination.java:145)\n\tat io.airbyte.workers.general.BufferedReplicationWorker.writeToDestination(BufferedReplicationWorker.java:459)\n\t... 5 more\n",
  "timestamp" : 1718894785434
}, {
  "failureOrigin" : "source",
  "internalMessage" : "Source process read attempt failed",
  "externalMessage" : "Something went wrong within the source connector",
  "metadata" : {
    "attemptNumber" : 0,
    "jobId" : 1390,
    "connector_command" : "read"
  },
  "stacktrace" : "io.airbyte.workers.internal.exception.SourceException: Source process read attempt failed\n\tat io.airbyte.workers.general.BufferedReplicationWorker.readFromSource(BufferedReplicationWorker.java:389)\n\tat io.airbyte.workers.general.BufferedReplicationWorker.lambda$runAsyncWithHeartbeatCheck$3(BufferedReplicationWorker.java:242)\n\tat java.base/java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)\n\tat java.base/java.lang.Thread.run(Thread.java:1583)\nCaused by: java.lang.IllegalStateException: Source process is still alive, cannot retrieve exit value.\n\tat com.google.common.base.Preconditions.checkState(Preconditions.java:515)\n\tat io.airbyte.workers.internal.DefaultAirbyteSource.getExitValue(DefaultAirbyteSource.java:136)\n\tat io.airbyte.workers.general.BufferedReplicationWorker.readFromSource(BufferedReplicationWorker.java:375)\n\t... 5 more\n",
  "timestamp" : 1718894785769
}, {
  "failureOrigin" : "replication",
  "internalMessage" : "java.io.IOException: Stream closed",
  "externalMessage" : "Something went wrong during replication",
  "metadata" : {
    "attemptNumber" : 0,
    "jobId" : 1390
  },
  "stacktrace" : "java.lang.RuntimeException: java.io.IOException: Stream closed\n\tat io.airbyte.workers.general.BufferedReplicationWorker$CloseableWithTimeout.lambda$close$0(BufferedReplicationWorker.java:538)\n\tat io.airbyte.workers.general.BufferedReplicationWorker.lambda$runAsyncWithTimeout$5(BufferedReplicationWorker.java:263)\n\tat java.base/java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)\n\tat java.base/java.lang.Thread.run(Thread.java:1583)\nCaused by: java.io.IOException: Stream closed\n\tat java.base/java.lang.ProcessBuilder$NullOutputStream.write(ProcessBuilder.java:447)\n\tat java.base/java.io.OutputStream.write(OutputStream.java:167)\n\tat java.base/java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:125)\n\tat java.base/java.io.BufferedOutputStream.implFlush(BufferedOutputStream.java:252)\n\tat java.base/java.io.BufferedOutputStream.flush(BufferedOutputStream.java:246)\n\tat java.base/sun.nio.cs.StreamEncoder.implFlush(StreamEncoder.java:412)\n\tat java.base/sun.nio.cs.StreamEncoder.lockedFlush(StreamEncoder.java:214)\n\tat java.base/sun.nio.cs.StreamEncoder.flush(StreamEncoder.java:201)\n\tat java.base/java.io.OutputStreamWriter.flush(OutputStreamWriter.java:262)\n\tat java.base/java.io.BufferedWriter.implFlush(BufferedWriter.java:372)\n\tat java.base/java.io.BufferedWriter.flush(BufferedWriter.java:359)\n\tat io.airbyte.workers.internal.DefaultAirbyteMessageBufferedWriter.flush(DefaultAirbyteMessageBufferedWriter.java:31)\n\tat io.airbyte.workers.internal.DefaultAirbyteDestination.notifyEndOfInputWithNoTimeoutMonitor(DefaultAirbyteDestination.java:155)\n\tat io.airbyte.workers.internal.DefaultAirbyteDestination.notifyEndOfInput(DefaultAirbyteDestination.java:145)\n\tat io.airbyte.workers.internal.DefaultAirbyteDestination.close(DefaultAirbyteDestination.java:170)\n\tat io.airbyte.workers.general.BufferedReplicationWorker$CloseableWithTimeout.lambda$close$0(BufferedReplicationWorker.java:536)\n\t... 5 more\n",
  "timestamp" : 1718894785770
} ]
2024-06-20 14:47:25 platform > 
2024-06-20 14:47:25 platform > ----- END REPLICATION -----
2024-06-20 14:47:25 platform > 
2024-06-20 14:47:26 platform > Retry State: RetryManager(completeFailureBackoffPolicy=BackoffPolicy(minInterval=PT10S, maxInterval=PT30M, base=3), partialFailureBackoffPolicy=null, successiveCompleteFailureLimit=5, totalCompleteFailureLimit=10, successivePartialFailureLimit=1000, totalPartialFailureLimit=20, successiveCompleteFailures=1, totalCompleteFailures=1, successivePartialFailures=0, totalPartialFailures=0)
 Backoff before next attempt: 10 seconds
2024-06-20 14:47:26 platform > Failing job: 1390, reason: Connection Check Failed 666f66ee-40f1-4a83-a9de-7908acfe9319


erwamartin commented 3 months ago

Note: I downgraded the destination connector to version 2.6.1 (my previous version) and it's working fine.

marcosmarxm commented 3 months ago

This is normal. To ensure the connector works properly, you must update both the connector and platform versions when new features are released. Closing, as the error message gives direction on what is needed to fix the problem.

erwamartin commented 3 months ago

This is normal. To ensure the connector works properly, you must update both the connector and platform versions when new features are released. Closing, as the error message gives direction on what is needed to fix the problem.

@marcosmarxm I'm sorry, but I don't understand. My Airbyte version is already 0.63.1, so why does this message appear?

nathmisaki commented 2 months ago

Same happening with Airbyte Platform version 0.63.2 and destination-bigquery connector version 2.8.0.

erwamartin commented 2 months ago

@marcosmarxm could you please re-open this issue?

marcosmarxm commented 2 months ago

@erwamartin my bad!!! I totally missed the platform version. I'll ask the engineering team to take a look.

marcosmarxm commented 2 months ago

Same happening with Airbyte Platform version 0.63.2 and destination-bigquery connector version 2.8.0.

What source and connector version are you using?

benmoriceau commented 2 months ago

Hello @erwamartin, could you help me understand the sequence of events here? I am wondering which was updated first: did you bump the platform version before bumping the connector version, or the opposite?

Thanks,

benmoriceau commented 2 months ago

@erwamartin could you run the following query on the Airbyte DB and let me know the result?

SELECT supports_refreshes 
FROM public.actor_definition_version
where docker_repository = 'airbyte/destination-bigquery' and docker_image_tag = '2.7.1';
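
For a broader view, a variant of the same check (illustrative only; it uses the same table and columns as the query above) lists every pinned destination-bigquery version and its flag:

SELECT docker_image_tag, supports_refreshes
FROM public.actor_definition_version
WHERE docker_repository = 'airbyte/destination-bigquery'
ORDER BY docker_image_tag;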

erwamartin commented 2 months ago

Hi @benmoriceau, thank you for your answers!

Hello @erwamartin, could you help me understand the sequence of events here? I am wondering which was updated first: did you bump the platform version before bumping the connector version, or the opposite?

Thanks,

I may have updated the connector first, and then the platform.

@erwamartin could you run the following query on the Airbyte DB and let me know the result?

SELECT supports_refreshes 
FROM public.actor_definition_version
where docker_repository = 'airbyte/destination-bigquery' and docker_image_tag = '2.7.1';

Here is the output (I downgraded the destination connector to version 2.6.1):

airbyte=# SELECT supports_refreshes FROM public.actor_definition_version where docker_repository = 'airbyte/destination-bigquery' and docker_image_tag = '2.7.1';
 supports_refreshes 
--------------------
 f
(1 row)

benmoriceau commented 2 months ago

Thanks, @erwamartin, this confirms the theory I had about the sequence of events that led to this issue. I am preparing a fix and will update the ticket when a new platform version that fixes this is released.

benmoriceau commented 2 months ago

Hello @erwamartin, the implementation of the fix is taking a little longer than expected. We would like to avoid a hack and move to a more long-term solution. Our plan is to re-import the actor definitions when the Airbyte platform starts; this will set the right value in the definition table after you upgrade the platform.

In the meantime, could you run the following query? It will allow you to upgrade to the latest version of destination-bigquery (or 2.7.1).

UPDATE public.actor_definition_version
SET supports_refreshes=true
WHERE docker_repository = 'airbyte/destination-bigquery' AND (docker_image_tag IN ('2.7.0', '2.7.1', '2.8.0', '2.8.1'));

Let me know if that works for you.
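
If it helps, here is a minimal verification sketch (same table as above) to confirm the flag flipped after running the UPDATE; each listed tag should then report supports_refreshes = t:

SELECT docker_image_tag, supports_refreshes
FROM public.actor_definition_version
WHERE docker_repository = 'airbyte/destination-bigquery'
  AND docker_image_tag IN ('2.7.0', '2.7.1', '2.8.0', '2.8.1');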

erwamartin commented 2 months ago

Hi @benmoriceau, I just ran the SQL update query and upgraded the BigQuery connector. It worked fine šŸŽ‰ Thank you for your help! šŸ™

benmoriceau commented 2 months ago

Great, I will close the ticket.