airbytehq / airbyte

The leading data integration platform for ETL / ELT data pipelines from APIs, databases & files to data warehouses, data lakes & data lakehouses. Both self-hosted and Cloud-hosted.
https://airbyte.com

[Source-Sendgrid] New Error since Feb 12th Update: "Error -3 while decompressing data: unknown compression method" #35501

Open blakels opened 4 months ago

blakels commented 4 months ago

Connector Name

Sendgrid

Connector Version

0.4.2

What step the error happened?

During the sync

Relevant information

The sync fails with an error about an unknown compression method while reading a stream, which causes the source to not exit properly and the sync to fail.

The error occurs when reading the contacts stream, which is actually empty.

The sync error was resolved when I removed contacts from the replication list, but it returned as soon as I added the stream back.

Note for reading logs: There are supposed to be records in the following streams (but the others have no records):

No records in streams:

Relevant log snippets:

2024-02-21 17:38:06 source > Encountered an exception while reading stream contacts
Traceback (most recent call last):
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/abstract_source.py", line 116, in read
    stream_is_available, reason = stream_instance.check_availability(logger, self)
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/streams/core.py", line 211, in check_availability
    return self.availability_strategy.check_availability(self, logger, source)
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/streams/http/availability_strategy.py", line 50, in check_availability
    get_first_record_for_slice(stream, stream_slice)
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/streams/utils/stream_helper.py", line 40, in get_first_record_for_slice
    return next(records_for_slice)
  File "/airbyte/integration_code/source_sendgrid/streams.py", line 215, in read_records
    for record in self.read_with_chunks(*self.download_data(url=url)):
  File "/airbyte/integration_code/source_sendgrid/streams.py", line 302, in download_data
    data_file.write(decompressor.decompress(chunk))
zlib.error: Error -3 while decompressing data: unknown compression method
2024-02-21 17:38:06 source > Marking stream contacts as STOPPED
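
To make the failure mode concrete, here is a minimal sketch (my own reproduction, not the connector's code), assuming the connector feeds raw response chunks into a `zlib` decompressor as the `download_data` traceback suggests. Any payload that does not start with a valid zlib/gzip stream makes `decompress()` raise `zlib.error` with code -3:

```python
# Minimal reproduction sketch -- assumes the contacts export body is not a valid
# zlib/gzip stream (e.g. an empty or corrupted file). Not the connector's code.
import zlib

decompressor = zlib.decompressobj()

try:
    # Two NUL bytes pass zlib's header checksum but declare an invalid compression
    # method, so this typically raises:
    #   zlib.error: Error -3 while decompressing data: unknown compression method
    decompressor.decompress(b"\x00\x00")
except zlib.error as exc:
    print(exc)
```

This suggests the downloaded contacts export is not actually valid gzip data when the stream is empty, rather than anything being wrong with the replication setup itself.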

Relevant log output

2024-02-21 17:32:22 platform > Retry State: RetryManager(completeFailureBackoffPolicy=BackoffPolicy(minInterval=PT10S, maxInterval=PT30M, base=3), partialFailureBackoffPolicy=null, successiveCompleteFailureLimit=5, totalCompleteFailureLimit=5, successivePartialFailureLimit=1000, totalPartialFailureLimit=10, successiveCompleteFailures=4, totalCompleteFailures=4, successivePartialFailures=0, totalPartialFailures=0)
2024-02-21 17:32:22 platform > Backing off for: 4 minutes 30 seconds.
2024-02-21 17:37:18 platform > Cloud storage job log path: /workspace/8629198/4/logs.log
2024-02-21 17:37:18 INFO i.a.w.l.p.s.m.Stage(apply):39 - APPLY Stage: CLAIM — (workloadId = d40d3c71-53bc-4abe-bfb6-5b9c1344c345_8629198_4_sync) — (dataplaneId = prod-dataplane-gcp-us-west3-0)
2024-02-21 17:37:41 INFO i.m.r.Micronaut(lambda$start$2):98 - Startup completed in 8172ms. Server Running: http://orchestrator-repl-job-8629198-attempt-4:9000
2024-02-21 17:37:46 replication-orchestrator > Writing async status INITIALIZING for KubePodInfo[namespace=jobs, name=orchestrator-repl-job-8629198-attempt-4, mainContainerInfo=KubeContainerInfo[image=airbyte/container-orchestrator:dev-77aec3b6b3, pullPolicy=IfNotPresent]]...
2024-02-21 17:37:47 replication-orchestrator > sourceLauncherConfig is: io.airbyte.persistence.job.models.IntegrationLauncherConfig@a904e64[jobId=8629198,attemptId=4,connectionId=d40d3c71-53bc-4abe-bfb6-5b9c1344c345,workspaceId=b15ce41e-086b-431e-b024-07e299f23f6e,dockerImage=airbyte/source-sendgrid:0.4.2,normalizationDockerImage=<null>,supportsDbt=false,normalizationIntegrationType=<null>,protocolVersion=Version{version='0.2.0', major='0', minor='2', patch='0'},isCustomConnector=false,allowedHosts=io.airbyte.config.AllowedHosts@22d23c27[hosts=[api.sendgrid.com, *.datadoghq.com, *.datadoghq.eu, *.sentry.io],additionalProperties={}],additionalEnvironmentVariables=<null>,additionalLabels={connection_id=d40d3c71-53bc-4abe-bfb6-5b9c1344c345, job_id=8629198, attempt_id=4, workspace_id=b15ce41e-086b-431e-b024-07e299f23f6e, airbyte=job-pod, mutex_key=d40d3c71-53bc-4abe-bfb6-5b9c1344c345, workload_id=d40d3c71-53bc-4abe-bfb6-5b9c1344c345_8629198_4_sync, auto_id=e27ad023-258c-49f3-8741-95176f47e9ff},additionalProperties={}]
2024-02-21 17:37:47 replication-orchestrator > Attempt 0 to get the source definition for feature flag checks
2024-02-21 17:37:47 replication-orchestrator > Attempt 0 to get the source definition
2024-02-21 17:37:48 replication-orchestrator > Concurrent stream read enabled? false
2024-02-21 17:37:48 replication-orchestrator > Setting up source...
2024-02-21 17:37:48 replication-orchestrator > Using default value for environment variable SIDECAR_KUBE_MEMORY_LIMIT: '50Mi'
2024-02-21 17:37:48 replication-orchestrator > Using default value for environment variable SIDECAR_MEMORY_REQUEST: '25Mi'
2024-02-21 17:37:48 replication-orchestrator > Using default value for environment variable SIDECAR_KUBE_CPU_LIMIT: '2.0'
2024-02-21 17:37:48 replication-orchestrator > Using default value for environment variable SIDECAR_KUBE_CPU_REQUEST: '0.1'
2024-02-21 17:37:48 replication-orchestrator > Using default value for environment variable SIDECAR_KUBE_MEMORY_LIMIT: '50Mi'
2024-02-21 17:37:48 replication-orchestrator > Using default value for environment variable SIDECAR_MEMORY_REQUEST: '25Mi'
2024-02-21 17:37:48 replication-orchestrator > Using default value for environment variable SIDECAR_KUBE_CPU_LIMIT: '2.0'
2024-02-21 17:37:48 replication-orchestrator > Using default value for environment variable SIDECAR_KUBE_CPU_REQUEST: '0.1'
2024-02-21 17:37:48 replication-orchestrator > Using default value for environment variable SIDECAR_KUBE_MEMORY_LIMIT: '50Mi'
2024-02-21 17:37:48 replication-orchestrator > Using default value for environment variable SIDECAR_MEMORY_REQUEST: '25Mi'
2024-02-21 17:37:48 replication-orchestrator > Setting up destination...
2024-02-21 17:37:48 replication-orchestrator > Setting up replication worker...
2024-02-21 17:37:48 replication-orchestrator > Running replication worker...
2024-02-21 17:37:48 replication-orchestrator > start sync worker. job id: 8629198 attempt id: 4
2024-02-21 17:37:48 replication-orchestrator > 
2024-02-21 17:37:48 replication-orchestrator > configured sync modes: {null.spam_reports=incremental - append_dedup, null.contacts=full_refresh - overwrite, null.bounces=incremental - append_dedup, null.campaigns=full_refresh - overwrite, null.unsubscribe_groups=full_refresh - overwrite, null.invalid_emails=incremental - append_dedup, null.global_suppressions=incremental - append_dedup, null.blocks=incremental - append_dedup}
2024-02-21 17:37:48 replication-orchestrator > ----- START REPLICATION -----
2024-02-21 17:37:48 replication-orchestrator > 
2024-02-21 17:37:48 replication-orchestrator > Running destination...
2024-02-21 17:37:48 replication-orchestrator > Using default value for environment variable SIDECAR_KUBE_CPU_LIMIT: '2.0'
2024-02-21 17:37:48 replication-orchestrator > Using default value for environment variable SIDECAR_KUBE_CPU_REQUEST: '0.1'
2024-02-21 17:37:48 replication-orchestrator > Using default value for environment variable OTEL_COLLECTOR_ENDPOINT: ''
2024-02-21 17:37:48 replication-orchestrator > Attempting to start pod = destination-google-sheets-write-8629198-4-zjzfb for airbyte/destination-google-sheets:0.2.3 with resources ConnectorResourceRequirements[main=io.airbyte.config.ResourceRequirements@7649d350[cpuRequest=0.2,cpuLimit=1,memoryRequest=1Gi,memoryLimit=2Gi,additionalProperties={}], heartbeat=io.airbyte.config.ResourceRequirements@37e7e089[cpuRequest=0.05,cpuLimit=0.2,memoryRequest=25Mi,memoryLimit=50Mi,additionalProperties={}], stdErr=io.airbyte.config.ResourceRequirements@58f05266[cpuRequest=0.01,cpuLimit=0.5,memoryRequest=25Mi,memoryLimit=50Mi,additionalProperties={}], stdIn=io.airbyte.config.ResourceRequirements@2b3ac26d[cpuRequest=0.1,cpuLimit=1,memoryRequest=25Mi,memoryLimit=50Mi,additionalProperties={}], stdOut=io.airbyte.config.ResourceRequirements@13515709[cpuRequest=0.01,cpuLimit=0.5,memoryRequest=25Mi,memoryLimit=50Mi,additionalProperties={}]] and allowedHosts null
2024-02-21 17:37:48 replication-orchestrator > destination-google-sheets-write-8629198-4-zjzfb stdoutLocalPort = 9880
2024-02-21 17:37:48 replication-orchestrator > destination-google-sheets-write-8629198-4-zjzfb stderrLocalPort = 9879
2024-02-21 17:37:48 replication-orchestrator > Using default value for environment variable SYNC_JOB_INIT_RETRY_TIMEOUT_MINUTES: '5'
2024-02-21 17:37:48 replication-orchestrator > Creating stdout socket server...
2024-02-21 17:37:48 replication-orchestrator > Creating stderr socket server...
2024-02-21 17:37:49 replication-orchestrator > Creating pod destination-google-sheets-write-8629198-4-zjzfb...
2024-02-21 17:37:51 replication-orchestrator > Waiting for init container to be ready before copying files...
2024-02-21 17:37:52 replication-orchestrator > Init container ready..
2024-02-21 17:37:52 replication-orchestrator > Copying files...
2024-02-21 17:37:52 replication-orchestrator > Uploading file: destination_config.json
2024-02-21 17:37:52 replication-orchestrator > kubectl cp /tmp/b129dd99-9069-4982-896f-e340ecba306f/destination_config.json jobs/destination-google-sheets-write-8629198-4-zjzfb:/config/destination_config.json -c init --retries=3
2024-02-21 17:37:52 replication-orchestrator > Waiting for kubectl cp to complete
2024-02-21 17:37:52 replication-orchestrator > kubectl cp complete, closing process
2024-02-21 17:37:52 replication-orchestrator > Uploading file: destination_catalog.json
2024-02-21 17:37:52 replication-orchestrator > kubectl cp /tmp/58c8015f-ef9a-446b-98e7-db2775beba69/destination_catalog.json jobs/destination-google-sheets-write-8629198-4-zjzfb:/config/destination_catalog.json -c init --retries=3
2024-02-21 17:37:52 replication-orchestrator > Waiting for kubectl cp to complete
2024-02-21 17:37:52 replication-orchestrator > kubectl cp complete, closing process
2024-02-21 17:37:52 replication-orchestrator > Uploading file: FINISHED_UPLOADING
2024-02-21 17:37:52 replication-orchestrator > kubectl cp /tmp/9a8ccd9d-d7d6-4aa7-a954-623d2a0e3f5c/FINISHED_UPLOADING jobs/destination-google-sheets-write-8629198-4-zjzfb:/config/FINISHED_UPLOADING -c init --retries=3
2024-02-21 17:37:52 replication-orchestrator > Waiting for kubectl cp to complete
2024-02-21 17:37:53 replication-orchestrator > kubectl cp complete, closing process
2024-02-21 17:37:53 replication-orchestrator > Waiting until pod is ready...
2024-02-21 17:37:53 replication-orchestrator > Setting stdout...
2024-02-21 17:37:53 replication-orchestrator > Setting stderr...
2024-02-21 17:37:54 replication-orchestrator > Reading pod IP...
2024-02-21 17:37:54 replication-orchestrator > Pod IP: 172.25.5.194
2024-02-21 17:37:54 replication-orchestrator > Creating stdin socket...
2024-02-21 17:37:54 replication-orchestrator > Writing messages to protocol version 0.2.0
2024-02-21 17:37:54 replication-orchestrator > Reading messages from protocol version 0.2.0
2024-02-21 17:37:54 replication-orchestrator > Using default value for environment variable SIDECAR_KUBE_CPU_LIMIT: '2.0'
2024-02-21 17:37:54 replication-orchestrator > Using default value for environment variable SIDECAR_KUBE_CPU_REQUEST: '0.1'
2024-02-21 17:37:54 replication-orchestrator > Using default value for environment variable OTEL_COLLECTOR_ENDPOINT: ''
2024-02-21 17:37:54 replication-orchestrator > Attempting to start pod = source-sendgrid-read-8629198-4-rgdkr for airbyte/source-sendgrid:0.4.2 with resources ConnectorResourceRequirements[main=io.airbyte.config.ResourceRequirements@70376c6[cpuRequest=0.2,cpuLimit=1,memoryRequest=1Gi,memoryLimit=2Gi,additionalProperties={}], heartbeat=io.airbyte.config.ResourceRequirements@37e7e089[cpuRequest=0.05,cpuLimit=0.2,memoryRequest=25Mi,memoryLimit=50Mi,additionalProperties={}], stdErr=io.airbyte.config.ResourceRequirements@2e89310c[cpuRequest=0.01,cpuLimit=0.5,memoryRequest=25Mi,memoryLimit=50Mi,additionalProperties={}], stdIn=null, stdOut=io.airbyte.config.ResourceRequirements@55c75b72[cpuRequest=0.2,cpuLimit=1,memoryRequest=25Mi,memoryLimit=50Mi,additionalProperties={}]] and allowedHosts io.airbyte.config.AllowedHosts@22d23c27[hosts=[api.sendgrid.com, *.datadoghq.com, *.datadoghq.eu, *.sentry.io],additionalProperties={}]
2024-02-21 17:37:54 replication-orchestrator > source-sendgrid-read-8629198-4-rgdkr stdoutLocalPort = 9878
2024-02-21 17:37:54 replication-orchestrator > source-sendgrid-read-8629198-4-rgdkr stderrLocalPort = 9877
2024-02-21 17:37:55 replication-orchestrator > Creating stdout socket server...
2024-02-21 17:37:55 replication-orchestrator > Creating stderr socket server...
2024-02-21 17:37:55 replication-orchestrator > Creating pod source-sendgrid-read-8629198-4-rgdkr...
2024-02-21 17:37:55 replication-orchestrator > Waiting for init container to be ready before copying files...
2024-02-21 17:37:55 replication-orchestrator > Init container ready..
2024-02-21 17:37:55 replication-orchestrator > Copying files...
2024-02-21 17:37:55 replication-orchestrator > Uploading file: input_state.json
2024-02-21 17:37:55 replication-orchestrator > kubectl cp /tmp/1514c91e-428d-4c6b-a461-163526207012/input_state.json jobs/source-sendgrid-read-8629198-4-rgdkr:/config/input_state.json -c init --retries=3
2024-02-21 17:37:55 replication-orchestrator > Waiting for kubectl cp to complete
2024-02-21 17:37:56 replication-orchestrator > kubectl cp complete, closing process
2024-02-21 17:37:56 replication-orchestrator > Uploading file: source_config.json
2024-02-21 17:37:56 replication-orchestrator > kubectl cp /tmp/024257db-ad4c-4939-b5e7-cec0a8f33cff/source_config.json jobs/source-sendgrid-read-8629198-4-rgdkr:/config/source_config.json -c init --retries=3
2024-02-21 17:37:56 replication-orchestrator > Waiting for kubectl cp to complete
2024-02-21 17:37:56 replication-orchestrator > kubectl cp complete, closing process
2024-02-21 17:37:56 replication-orchestrator > Uploading file: source_catalog.json
2024-02-21 17:37:56 replication-orchestrator > kubectl cp /tmp/2bf0f1c5-92f2-42b6-b839-dab834996566/source_catalog.json jobs/source-sendgrid-read-8629198-4-rgdkr:/config/source_catalog.json -c init --retries=3
2024-02-21 17:37:56 replication-orchestrator > Waiting for kubectl cp to complete
2024-02-21 17:37:56 replication-orchestrator > kubectl cp complete, closing process
2024-02-21 17:37:56 replication-orchestrator > Uploading file: FINISHED_UPLOADING
2024-02-21 17:37:56 replication-orchestrator > kubectl cp /tmp/5f8ccd11-53c9-40ed-bc29-6028c93eccb1/FINISHED_UPLOADING jobs/source-sendgrid-read-8629198-4-rgdkr:/config/FINISHED_UPLOADING -c init --retries=3
2024-02-21 17:37:56 replication-orchestrator > Waiting for kubectl cp to complete
2024-02-21 17:37:56 replication-orchestrator > kubectl cp complete, closing process
2024-02-21 17:37:56 replication-orchestrator > Waiting until pod is ready...
2024-02-21 17:37:58 replication-orchestrator > Setting stdout...
2024-02-21 17:37:58 replication-orchestrator > Setting stderr...
2024-02-21 17:37:58 replication-orchestrator > Reading pod IP...
2024-02-21 17:37:58 replication-orchestrator > Pod IP: 172.25.5.195
2024-02-21 17:37:58 replication-orchestrator > Using null stdin output stream...
2024-02-21 17:37:58 replication-orchestrator > Reading messages from protocol version 0.2.0
2024-02-21 17:37:58 replication-orchestrator > Writing async status RUNNING for KubePodInfo[namespace=jobs, name=orchestrator-repl-job-8629198-attempt-4, mainContainerInfo=KubeContainerInfo[image=airbyte/container-orchestrator:dev-77aec3b6b3, pullPolicy=IfNotPresent]]...
2024-02-21 17:37:59 replication-orchestrator > Destination output thread started.
2024-02-21 17:37:59 replication-orchestrator > Replication thread started.
2024-02-21 17:37:59 replication-orchestrator > Starting source heartbeat check. Will check every 1 minutes.
2024-02-21 17:37:59 replication-orchestrator > Waiting for source and destination threads to complete.
2024-02-21 17:37:59 replication-orchestrator > Starting workload heartbeat
2024-02-21 17:37:59 destination > Begin writing to the destination...
2024-02-21 17:38:03 source > Starting syncing SourceSendgrid
2024-02-21 17:38:03 source > Successfully connected to stream campaigns, but got 0 records.
2024-02-21 17:38:03 source > Marking stream campaigns as STARTED
2024-02-21 17:38:03 replication-orchestrator > Attempt 0 to stream status started null:campaigns
2024-02-21 17:38:04 source > Syncing stream: campaigns 
2024-02-21 17:38:04 source > Read 0 records from campaigns stream
2024-02-21 17:38:04 source > Marking stream campaigns as STOPPED
2024-02-21 17:38:04 source > Finished syncing campaigns
2024-02-21 17:38:04 source > SourceSendgrid runtimes:
Syncing stream campaigns 0:00:01.135133
2024-02-21 17:38:06 source > Sleeping 0.501 seconds while waiting for Job: contacts/bceb4438-8b07-4313-ace1-2d1c38882e8e to complete. Current state: pending
2024-02-21 17:38:06 source > Encountered an exception while reading stream contacts
Traceback (most recent call last):
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/abstract_source.py", line 116, in read
    stream_is_available, reason = stream_instance.check_availability(logger, self)
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/streams/core.py", line 211, in check_availability
    return self.availability_strategy.check_availability(self, logger, source)
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/streams/http/availability_strategy.py", line 50, in check_availability
    get_first_record_for_slice(stream, stream_slice)
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/streams/utils/stream_helper.py", line 40, in get_first_record_for_slice
    return next(records_for_slice)
  File "/airbyte/integration_code/source_sendgrid/streams.py", line 215, in read_records
    for record in self.read_with_chunks(*self.download_data(url=url)):
  File "/airbyte/integration_code/source_sendgrid/streams.py", line 302, in download_data
    data_file.write(decompressor.decompress(chunk))
zlib.error: Error -3 while decompressing data: unknown compression method
2024-02-21 17:38:06 source > Marking stream contacts as STOPPED
2024-02-21 17:38:06 replication-orchestrator > Unable to update stream status for event ReplicationAirbyteMessageEvent(airbyteMessageOrigin=SOURCE, airbyteMessage=io.airbyte.protocol.models.AirbyteMessage@2f9ed135[type=TRACE,log=<null>,spec=<null>,connectionStatus=<null>,catalog=<null>,record=<null>,state=<null>,trace=io.airbyte.protocol.models.AirbyteTraceMessage@4ab78512[type=STREAM_STATUS,emittedAt=1.708537086679931E12,error=<null>,estimate=<null>,streamStatus=io.airbyte.protocol.models.AirbyteStreamStatusTraceMessage@623202c2[streamDescriptor=io.airbyte.protocol.models.StreamDescriptor@6cb893f1[name=contacts,namespace=<null>,additionalProperties={}],status=INCOMPLETE,additionalProperties={}],analytics=<null>,additionalProperties={}],control=<null>,additionalProperties={}], replicationContext=ReplicationContext[isReset=false, connectionId=d40d3c71-53bc-4abe-bfb6-5b9c1344c345, sourceId=19d02380-f30a-4bd9-bded-65f6b5317805, destinationId=43157528-aaae-412d-970c-f87125c3ccfd, jobId=8629198, attempt=4, workspaceId=b15ce41e-086b-431e-b024-07e299f23f6e, sourceImage=airbyte/source-sendgrid:0.4.2, destinationImage=airbyte/destination-google-sheets:0.2.3], incompleteRunCause=null)
io.airbyte.workers.internal.exception.StreamStatusException: Invalid stream status transition to INCOMPLETE (origin = SOURCE, context = ReplicationContext[isReset=false, connectionId=d40d3c71-53bc-4abe-bfb6-5b9c1344c345, sourceId=19d02380-f30a-4bd9-bded-65f6b5317805, destinationId=43157528-aaae-412d-970c-f87125c3ccfd, jobId=8629198, attempt=4, workspaceId=b15ce41e-086b-431e-b024-07e299f23f6e, sourceImage=airbyte/source-sendgrid:0.4.2, destinationImage=airbyte/destination-google-sheets:0.2.3], stream = null:contacts)
    at io.airbyte.workers.internal.bookkeeping.StreamStatusTracker.handleStreamIncomplete-zkXUZaI(StreamStatusTracker.kt:249) ~[io.airbyte-airbyte-commons-worker-dev-77aec3b6b3.jar:?]
    at io.airbyte.workers.internal.bookkeeping.StreamStatusTracker.handleStreamStatus(StreamStatusTracker.kt:107) ~[io.airbyte-airbyte-commons-worker-dev-77aec3b6b3.jar:?]
    at io.airbyte.workers.internal.bookkeeping.StreamStatusTracker.track(StreamStatusTracker.kt:60) ~[io.airbyte-airbyte-commons-worker-dev-77aec3b6b3.jar:?]
    at io.airbyte.workers.internal.bookkeeping.events.AirbyteStreamStatusMessageEventListener.onApplicationEvent(AirbyteStreamStatusMessageEventListener.kt:18) ~[io.airbyte-airbyte-commons-worker-dev-77aec3b6b3.jar:?]
    at io.airbyte.workers.internal.bookkeeping.events.AirbyteStreamStatusMessageEventListener.onApplicationEvent(AirbyteStreamStatusMessageEventListener.kt:15) ~[io.airbyte-airbyte-commons-worker-dev-77aec3b6b3.jar:?]
    at io.micronaut.context.event.ApplicationEventPublisherFactory.notifyEventListeners(ApplicationEventPublisherFactory.java:262) ~[micronaut-inject-3.10.1.jar:3.10.1]
    at io.micronaut.context.event.ApplicationEventPublisherFactory.access$200(ApplicationEventPublisherFactory.java:60) ~[micronaut-inject-3.10.1.jar:3.10.1]
    at io.micronaut.context.event.ApplicationEventPublisherFactory$2.publishEvent(ApplicationEventPublisherFactory.java:229) ~[micronaut-inject-3.10.1.jar:3.10.1]
    at io.airbyte.workers.internal.bookkeeping.events.ReplicationAirbyteMessageEventPublishingHelper.publishStatusEvent(ReplicationAirbyteMessageEventPublishingHelper.kt:67) ~[io.airbyte-airbyte-commons-worker-dev-77aec3b6b3.jar:?]
    at io.airbyte.workers.general.ReplicationWorkerHelper.internalProcessMessageFromSource(ReplicationWorkerHelper.kt:350) ~[io.airbyte-airbyte-commons-worker-dev-77aec3b6b3.jar:?]
    at io.airbyte.workers.general.ReplicationWorkerHelper.processMessageFromSource(ReplicationWorkerHelper.kt:418) ~[io.airbyte-airbyte-commons-worker-dev-77aec3b6b3.jar:?]
    at io.airbyte.workers.general.DefaultReplicationWorker.lambda$readFromSrcAndWriteToDstRunnable$8(DefaultReplicationWorker.java:350) ~[io.airbyte-airbyte-commons-worker-dev-77aec3b6b3.jar:?]
    at java.base/java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804) ~[?:?]
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) ~[?:?]
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ~[?:?]
    at java.base/java.lang.Thread.run(Thread.java:1583) [?:?]
2024-02-21 17:38:06 source > Finished syncing contacts
2024-02-21 17:38:06 source > SourceSendgrid runtimes:
Syncing stream campaigns 0:00:01.135133
Syncing stream contacts 0:00:02.391540
2024-02-21 17:38:06 source > Error -3 while decompressing data: unknown compression method
Traceback (most recent call last):
  File "/airbyte/integration_code/main.py", line 8, in <module>
    run()
  File "/airbyte/integration_code/source_sendgrid/run.py", line 14, in run
    launch(source, sys.argv[1:])
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/entrypoint.py", line 209, in launch
    for message in source_entrypoint.run(parsed_args):
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/entrypoint.py", line 116, in run
    yield from map(AirbyteEntrypoint.airbyte_message_to_string, self.read(source_spec, config, config_catalog, state))
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/entrypoint.py", line 158, in read
    yield from self.source.read(self.logger, config, catalog, state)
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/abstract_source.py", line 142, in read
    raise e
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/abstract_source.py", line 116, in read
    stream_is_available, reason = stream_instance.check_availability(logger, self)
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/streams/core.py", line 211, in check_availability
    return self.availability_strategy.check_availability(self, logger, source)
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/streams/http/availability_strategy.py", line 50, in check_availability
    get_first_record_for_slice(stream, stream_slice)
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/streams/utils/stream_helper.py", line 40, in get_first_record_for_slice
    return next(records_for_slice)
  File "/airbyte/integration_code/source_sendgrid/streams.py", line 215, in read_records
    for record in self.read_with_chunks(*self.download_data(url=url)):
  File "/airbyte/integration_code/source_sendgrid/streams.py", line 302, in download_data
    data_file.write(decompressor.decompress(chunk))
zlib.error: Error -3 while decompressing data: unknown compression method
2024-02-21 17:38:06 replication-orchestrator > Source has no more messages, closing connection.
2024-02-21 17:38:07 replication-orchestrator > (pod: jobs / source-sendgrid-read-8629198-4-rgdkr) - Closed all resources for pod
2024-02-21 17:38:07 replication-orchestrator > Attempt 0 to update stream status incomplete null:campaigns
2024-02-21 17:38:12 destination > Auth session is expired. Refreshing...
2024-02-21 17:38:12 destination > Successfully refreshed auth session
2024-02-21 17:38:12 destination > Skipping empty stream: campaigns
2024-02-21 17:38:12 destination > Skipping empty stream: contacts
2024-02-21 17:38:12 destination > Skipping empty stream: global_suppressions
2024-02-21 17:38:12 destination > Skipping empty stream: blocks
2024-02-21 17:38:12 destination > Skipping empty stream: bounces
2024-02-21 17:38:12 destination > Skipping empty stream: invalid_emails
2024-02-21 17:38:12 destination > Skipping empty stream: spam_reports
2024-02-21 17:38:12 destination > Skipping empty stream: unsubscribe_groups
2024-02-21 17:38:12 destination > No duplicated records found for stream: global_suppressions
2024-02-21 17:38:12 destination > No duplicated records found for stream: blocks
2024-02-21 17:38:12 destination > No duplicated records found for stream: bounces
2024-02-21 17:38:12 destination > No duplicated records found for stream: invalid_emails
2024-02-21 17:38:12 destination > No duplicated records found for stream: spam_reports
2024-02-21 17:38:12 destination > Writing complete.
2024-02-21 17:38:14 replication-orchestrator > (pod: jobs / destination-google-sheets-write-8629198-4-zjzfb) - Closed all resources for pod
2024-02-21 17:38:14 replication-orchestrator > thread status... timeout thread: false , replication thread: true
2024-02-21 17:38:14 replication-orchestrator > Sync worker failed.
java.util.concurrent.ExecutionException: io.airbyte.workers.internal.exception.SourceException: Source didn't exit properly - check the logs!
    at java.base/java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:396) ~[?:?]
    at java.base/java.util.concurrent.CompletableFuture.get(CompletableFuture.java:2073) ~[?:?]
    at io.airbyte.workers.general.DefaultReplicationWorker.replicate(DefaultReplicationWorker.java:213) [io.airbyte-airbyte-commons-worker-dev-77aec3b6b3.jar:?]
    at io.airbyte.workers.general.DefaultReplicationWorker.run(DefaultReplicationWorker.java:143) [io.airbyte-airbyte-commons-worker-dev-77aec3b6b3.jar:?]
    at io.airbyte.workers.general.DefaultReplicationWorker.run(DefaultReplicationWorker.java:63) [io.airbyte-airbyte-commons-worker-dev-77aec3b6b3.jar:?]
    at io.airbyte.container_orchestrator.orchestrator.ReplicationJobOrchestrator.runWithWorkloadEnabled(ReplicationJobOrchestrator.java:148) [io.airbyte-airbyte-container-orchestrator-dev-77aec3b6b3.jar:?]
    at io.airbyte.container_orchestrator.orchestrator.ReplicationJobOrchestrator.runJob(ReplicationJobOrchestrator.java:127) [io.airbyte-airbyte-container-orchestrator-dev-77aec3b6b3.jar:?]
    at io.airbyte.container_orchestrator.Application.run(Application.java:78) [io.airbyte-airbyte-container-orchestrator-dev-77aec3b6b3.jar:?]
    at io.airbyte.container_orchestrator.Application.main(Application.java:38) [io.airbyte-airbyte-container-orchestrator-dev-77aec3b6b3.jar:?]
    Suppressed: io.airbyte.workers.exception.WorkerException: Source process exit with code 1. This warning is normal if the job was cancelled.
        at io.airbyte.workers.internal.DefaultAirbyteSource.close(DefaultAirbyteSource.java:158) ~[io.airbyte-airbyte-commons-worker-dev-77aec3b6b3.jar:?]
        at io.airbyte.workers.general.DefaultReplicationWorker.replicate(DefaultReplicationWorker.java:161) [io.airbyte-airbyte-commons-worker-dev-77aec3b6b3.jar:?]
        at io.airbyte.workers.general.DefaultReplicationWorker.run(DefaultReplicationWorker.java:143) [io.airbyte-airbyte-commons-worker-dev-77aec3b6b3.jar:?]
        at io.airbyte.workers.general.DefaultReplicationWorker.run(DefaultReplicationWorker.java:63) [io.airbyte-airbyte-commons-worker-dev-77aec3b6b3.jar:?]
        at io.airbyte.container_orchestrator.orchestrator.ReplicationJobOrchestrator.runWithWorkloadEnabled(ReplicationJobOrchestrator.java:148) [io.airbyte-airbyte-container-orchestrator-dev-77aec3b6b3.jar:?]
        at io.airbyte.container_orchestrator.orchestrator.ReplicationJobOrchestrator.runJob(ReplicationJobOrchestrator.java:127) [io.airbyte-airbyte-container-orchestrator-dev-77aec3b6b3.jar:?]
        at io.airbyte.container_orchestrator.Application.run(Application.java:78) [io.airbyte-airbyte-container-orchestrator-dev-77aec3b6b3.jar:?]
        at io.airbyte.container_orchestrator.Application.main(Application.java:38) [io.airbyte-airbyte-container-orchestrator-dev-77aec3b6b3.jar:?]
Caused by: io.airbyte.workers.internal.exception.SourceException: Source didn't exit properly - check the logs!
    at io.airbyte.workers.general.DefaultReplicationWorker.lambda$readFromSrcAndWriteToDstRunnable$8(DefaultReplicationWorker.java:367) ~[io.airbyte-airbyte-commons-worker-dev-77aec3b6b3.jar:?]
    at java.base/java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804) ~[?:?]
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) ~[?:?]
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ~[?:?]
    at java.base/java.lang.Thread.run(Thread.java:1583) ~[?:?]
Caused by: io.airbyte.workers.exception.WorkerException: Source process exit with code 1. This warning is normal if the job was cancelled.
    at io.airbyte.workers.internal.DefaultAirbyteSource.close(DefaultAirbyteSource.java:158) ~[io.airbyte-airbyte-commons-worker-dev-77aec3b6b3.jar:?]
    at io.airbyte.workers.general.DefaultReplicationWorker.lambda$readFromSrcAndWriteToDstRunnable$8(DefaultReplicationWorker.java:365) ~[io.airbyte-airbyte-commons-worker-dev-77aec3b6b3.jar:?]
    at java.base/java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804) ~[?:?]
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) ~[?:?]
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ~[?:?]
    at java.base/java.lang.Thread.run(Thread.java:1583) ~[?:?]
2024-02-21 17:38:14 replication-orchestrator > sync summary: {
  "status" : "failed",
  "startTime" : 1708537068783,
  "endTime" : 1708537094315,
  "totalStats" : {
    "bytesEmitted" : 0,
    "destinationStateMessagesEmitted" : 0,
    "destinationWriteEndTime" : 1708537093078,
    "destinationWriteStartTime" : 1708537068797,
    "meanSecondsBeforeSourceStateMessageEmitted" : 0,
    "maxSecondsBeforeSourceStateMessageEmitted" : 0,
    "meanSecondsBetweenStateMessageEmittedandCommitted" : 0,
    "recordsEmitted" : 0,
    "replicationEndTime" : 0,
    "replicationStartTime" : 1708537068783,
    "sourceReadEndTime" : 0,
    "sourceReadStartTime" : 1708537074987,
    "sourceStateMessagesEmitted" : 0
  },
  "streamStats" : [ ]
}
2024-02-21 17:38:14 replication-orchestrator > failures: [ {
  "failureOrigin" : "source",
  "failureType" : "system_error",
  "internalMessage" : "Error -3 while decompressing data: unknown compression method",
  "externalMessage" : "Something went wrong in the connector. See the logs for more details.",
  "metadata" : {
    "attemptNumber" : 4,
    "jobId" : 8629198,
    "from_trace_message" : true,
    "connector_command" : "read"
  },
  "stacktrace" : "Traceback (most recent call last):\n  File \"/airbyte/integration_code/main.py\", line 8, in <module>\n    run()\n  File \"/airbyte/integration_code/source_sendgrid/run.py\", line 14, in run\n    launch(source, sys.argv[1:])\n  File \"/usr/local/lib/python3.9/site-packages/airbyte_cdk/entrypoint.py\", line 209, in launch\n    for message in source_entrypoint.run(parsed_args):\n  File \"/usr/local/lib/python3.9/site-packages/airbyte_cdk/entrypoint.py\", line 116, in run\n    yield from map(AirbyteEntrypoint.airbyte_message_to_string, self.read(source_spec, config, config_catalog, state))\n  File \"/usr/local/lib/python3.9/site-packages/airbyte_cdk/entrypoint.py\", line 158, in read\n    yield from self.source.read(self.logger, config, catalog, state)\n  File \"/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/abstract_source.py\", line 142, in read\n    raise e\n  File \"/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/abstract_source.py\", line 116, in read\n    stream_is_available, reason = stream_instance.check_availability(logger, self)\n  File \"/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/streams/core.py\", line 211, in check_availability\n    return self.availability_strategy.check_availability(self, logger, source)\n  File \"/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/streams/http/availability_strategy.py\", line 50, in check_availability\n    get_first_record_for_slice(stream, stream_slice)\n  File \"/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/streams/utils/stream_helper.py\", line 40, in get_first_record_for_slice\n    return next(records_for_slice)\n  File \"/airbyte/integration_code/source_sendgrid/streams.py\", line 215, in read_records\n    for record in self.read_with_chunks(*self.download_data(url=url)):\n  File \"/airbyte/integration_code/source_sendgrid/streams.py\", line 302, in download_data\n    data_file.write(decompressor.decompress(chunk))\nzlib.error: Error -3 while decompressing data: unknown compression method\n",
  "timestamp" : 1708537086682
}, {
  "failureOrigin" : "source",
  "internalMessage" : "Source didn't exit properly - check the logs!",
  "externalMessage" : "Something went wrong within the source connector",
  "metadata" : {
    "attemptNumber" : 4,
    "jobId" : 8629198,
    "connector_command" : "read"
  },
  "stacktrace" : "io.airbyte.workers.internal.exception.SourceException: Source didn't exit properly - check the logs!\n\tat io.airbyte.workers.general.DefaultReplicationWorker.lambda$readFromSrcAndWriteToDstRunnable$8(DefaultReplicationWorker.java:367)\n\tat java.base/java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)\n\tat java.base/java.lang.Thread.run(Thread.java:1583)\nCaused by: io.airbyte.workers.exception.WorkerException: Source process exit with code 1. This warning is normal if the job was cancelled.\n\tat io.airbyte.workers.internal.DefaultAirbyteSource.close(DefaultAirbyteSource.java:158)\n\tat io.airbyte.workers.general.DefaultReplicationWorker.lambda$readFromSrcAndWriteToDstRunnable$8(DefaultReplicationWorker.java:365)\n\t... 4 more\n",
  "timestamp" : 1708537087203
} ]
2024-02-21 17:38:14 replication-orchestrator > 
2024-02-21 17:38:14 replication-orchestrator > ----- END REPLICATION -----
2024-02-21 17:38:14 replication-orchestrator > 
2024-02-21 17:38:15 replication-orchestrator > Returning output...
2024-02-21 17:38:15 replication-orchestrator > Writing async status SUCCEEDED for KubePodInfo[namespace=jobs, name=orchestrator-repl-job-8629198-attempt-4, mainContainerInfo=KubeContainerInfo[image=airbyte/container-orchestrator:dev-77aec3b6b3, pullPolicy=IfNotPresent]]...
2024-02-21 17:37:43 INFO c.l.l.LDSLF4J$ChannelImpl(log):73 - Enabling streaming API
2024-02-21 17:37:43 INFO c.l.l.LDSLF4J$ChannelImpl(log):94 - Waiting up to 5000 milliseconds for LaunchDarkly client to start...
2024-02-21 17:37:46 INFO i.a.m.l.MetricClientFactory(initializeDatadogMetricClient):126 - Initializing DatadogMetricClient
2024-02-21 17:37:46 INFO i.a.c.EnvConfigs(getEnvOrDefault):694 - Using default value for environment variable DD_CONSTANT_TAGS: ''
2024-02-21 17:37:46 INFO i.a.m.l.DogStatsDMetricClient(initialize):52 - Starting DogStatsD client..
2024-02-21 17:37:18 platform > Executing worker wrapper. Airbyte version: dev-77aec3b6b3-cloud
2024-02-21 17:37:18 platform > Attempt 0 to save workflow id for cancellation
2024-02-21 17:37:18 platform > Creating workload d40d3c71-53bc-4abe-bfb6-5b9c1344c345_8629198_4_sync
2024-02-21 17:37:18 platform > Unknown feature flag "workload.polling.interval"; returning default value
2024-02-21 17:37:18 platform > Workload d40d3c71-53bc-4abe-bfb6-5b9c1344c345_8629198_4_sync is pending
2024-02-21 17:37:18 INFO i.a.w.l.c.WorkloadApiClient(claim):69 - Claimed: true for d40d3c71-53bc-4abe-bfb6-5b9c1344c345_8629198_4_sync via API for prod-dataplane-gcp-us-west3-0
2024-02-21 17:37:18 INFO i.a.w.l.p.s.m.Stage(apply):39 - APPLY Stage: CHECK_STATUS — (workloadId = d40d3c71-53bc-4abe-bfb6-5b9c1344c345_8629198_4_sync) — (dataplaneId = prod-dataplane-gcp-us-west3-0)
2024-02-21 17:37:18 INFO i.a.w.l.p.s.CheckStatusStage(applyStage):61 - No pod found running for workload d40d3c71-53bc-4abe-bfb6-5b9c1344c345_8629198_4_sync
2024-02-21 17:37:18 INFO i.a.w.l.p.s.m.Stage(apply):39 - APPLY Stage: BUILD — (workloadId = d40d3c71-53bc-4abe-bfb6-5b9c1344c345_8629198_4_sync) — (dataplaneId = prod-dataplane-gcp-us-west3-0)
2024-02-21 17:37:18 INFO i.a.a.c.AirbyteApiClient(retryWithJitterThrows):297 - Attempt 0 to retrieve the connection
2024-02-21 17:37:18 INFO i.a.a.c.AirbyteApiClient(retryWithJitterThrows):297 - Attempt 0 to retrieve the state
2024-02-21 17:37:18 INFO i.a.w.l.p.s.m.Stage(apply):39 - APPLY Stage: MUTEX — (workloadId = d40d3c71-53bc-4abe-bfb6-5b9c1344c345_8629198_4_sync) — (dataplaneId = prod-dataplane-gcp-us-west3-0)
2024-02-21 17:37:18 INFO i.a.w.l.p.s.EnforceMutexStage(applyStage):55 - Mutex key: d40d3c71-53bc-4abe-bfb6-5b9c1344c345 specified for workload: d40d3c71-53bc-4abe-bfb6-5b9c1344c345_8629198_4_sync. Attempting to delete existing pods...
2024-02-21 17:37:19 INFO i.a.w.l.p.s.EnforceMutexStage(applyStage):59 - Existing pods for mutex key: d40d3c71-53bc-4abe-bfb6-5b9c1344c345 deleted.
2024-02-21 17:37:19 INFO i.a.w.l.p.s.m.Stage(apply):39 - APPLY Stage: LAUNCH — (workloadId = d40d3c71-53bc-4abe-bfb6-5b9c1344c345_8629198_4_sync) — (dataplaneId = prod-dataplane-gcp-us-west3-0)
2024-02-21 17:37:58 INFO i.a.w.l.c.WorkloadApiClient(updateStatusToLaunched):54 - Attempting to update workload: d40d3c71-53bc-4abe-bfb6-5b9c1344c345_8629198_4_sync to LAUNCHED.
2024-02-21 17:37:58 INFO i.a.w.l.p.h.SuccessHandler(accept):61 - Pipeline completed for workload: d40d3c71-53bc-4abe-bfb6-5b9c1344c345_8629198_4_sync.
2024-02-21 17:38:19 platform > Retry State: RetryManager(completeFailureBackoffPolicy=BackoffPolicy(minInterval=PT10S, maxInterval=PT30M, base=3), partialFailureBackoffPolicy=null, successiveCompleteFailureLimit=5, totalCompleteFailureLimit=5, successivePartialFailureLimit=1000, totalPartialFailureLimit=10, successiveCompleteFailures=5, totalCompleteFailures=5, successivePartialFailures=0, totalPartialFailures=0)
 Backoff before next attempt: 13 minutes 30 seconds
2024-02-21 17:38:19 platform > Failing job: 8629198, reason: Job failed after too many retries for connection d40d3c71-53bc-4abe-bfb6-5b9c1344c345
2024-02-21 17:38:18 platform > Workload d40d3c71-53bc-4abe-bfb6-5b9c1344c345_8629198_4_sync has returned a terminal status of failure.  Fetching output...
2024-02-21 17:38:18 platform > Replication output for workload d40d3c71-53bc-4abe-bfb6-5b9c1344c345_8629198_4_sync : io.airbyte.config.ReplicationOutput@26ebbbf3[replicationAttemptSummary=io.airbyte.config.ReplicationAttemptSummary@40c23dcd[status=failed,recordsSynced=<null>,bytesSynced=<null>,startTime=1708537068783,endTime=1708537094315,totalStats=io.airbyte.config.SyncStats@57d0792a[bytesCommitted=<null>,bytesEmitted=0,destinationStateMessagesEmitted=0,destinationWriteEndTime=1708537093078,destinationWriteStartTime=1708537068797,estimatedBytes=<null>,estimatedRecords=<null>,meanSecondsBeforeSourceStateMessageEmitted=0,maxSecondsBeforeSourceStateMessageEmitted=0,maxSecondsBetweenStateMessageEmittedandCommitted=<null>,meanSecondsBetweenStateMessageEmittedandCommitted=0,recordsEmitted=0,recordsCommitted=<null>,replicationEndTime=0,replicationStartTime=1708537068783,sourceReadEndTime=0,sourceReadStartTime=1708537074987,sourceStateMessagesEmitted=0,additionalProperties={}],streamStats=[],performanceMetrics=<null>,additionalProperties={}],state=<null>,outputCatalog=io.airbyte.protocol.models.ConfiguredAirbyteCatalog@11a8994[streams=[io.airbyte.protocol.models.ConfiguredAirbyteStream@7a559e67[stream=io.airbyte.protocol.models.AirbyteStream@2ba8e8b5[name=campaigns,jsonSchema={"type":"object","$schema":"http://json-schema.org/draft-07/schema#","properties":{"id":{"type":"string"},"name":{"type":"string"},"status":{"type":"string"},"is_abtest":{"type":"boolean"},"created_at":{"type":["string","null"],"format":"date-time"},"updated_at":{"type":["string","null"],"format":"date-time"}}},supportedSyncModes=[full_refresh],sourceDefinedCursor=<null>,defaultCursorField=[],sourceDefinedPrimaryKey=[[id]],namespace=<null>,additionalProperties={}],syncMode=full_refresh,cursorField=[],destinationSyncMode=overwrite,primaryKey=[[id]],additionalProperties={}], io.airbyte.protocol.models.ConfiguredAirbyteStream@7fe5cdc3[stream=io.airbyte.protocol.models.AirbyteStream@2a66ad82[name=contacts,jsonSchema={"type":"object","$schema":"http://json-schema.org/draft-07/schema#","properties":{"city":{"type":["string","null"]},"line":{"type":["string","null"]},"email":{"type":["string","null"]},"country":{"type":["string","null"]},"facebook":{"type":["string","null"]},"list_ids":{"type":["null","array"]},"whatsapp":{"type":["string","null"]},"last_name":{"type":["string","null"]},"contact_id":{"type":["string","null"]},"created_at":{"type":["string","null"],"format":"date-time"},"first_name":{"type":["string","null"]},"updated_at":{"type":["string","null"],"format":"date-time"},"postal_code":{"type":["string","null"]},"unique_name":{"type":["string","null"]},"phone_number":{"type":["string","null"]},"custom_fields":{"type":["object","null"]},"address_line_1":{"type":["string","null"]},"address_line_2":{"type":["string","null"]},"alternate_emails":{"type":["null","array"]},"state_province_region":{"type":["string","null"]}}},supportedSyncModes=[full_refresh],sourceDefinedCursor=<null>,defaultCursorField=[],sourceDefinedPrimaryKey=[[contact_id]],namespace=<null>,additionalProperties={}],syncMode=full_refresh,cursorField=[],destinationSyncMode=overwrite,primaryKey=[[contact_id]],additionalProperties={}], io.airbyte.protocol.models.ConfiguredAirbyteStream@7c16a80f[stream=io.airbyte.protocol.models.AirbyteStream@10171a78[name=global_suppressions,jsonSchema={"type":"object","$schema":"http://json-schema.org/draft-07/schema#","properties":{"email":{"type":"string"},"created":{"type":"integer"}}},supportedSyncModes=[full_refresh, 
incremental],sourceDefinedCursor=true,defaultCursorField=[created],sourceDefinedPrimaryKey=[[email]],namespace=<null>,additionalProperties={}],syncMode=incremental,cursorField=[created],destinationSyncMode=append_dedup,primaryKey=[[email]],additionalProperties={}], io.airbyte.protocol.models.ConfiguredAirbyteStream@414d466c[stream=io.airbyte.protocol.models.AirbyteStream@5e58e97a[name=blocks,jsonSchema={"type":"object","$schema":"http://json-schema.org/draft-07/schema#","properties":{"email":{"type":"string"},"reason":{"type":"string"},"status":{"type":"string"},"created":{"type":"integer"}}},supportedSyncModes=[full_refresh, incremental],sourceDefinedCursor=true,defaultCursorField=[created],sourceDefinedPrimaryKey=[[email]],namespace=<null>,additionalProperties={}],syncMode=incremental,cursorField=[created],destinationSyncMode=append_dedup,primaryKey=[[email]],additionalProperties={}], io.airbyte.protocol.models.ConfiguredAirbyteStream@1747746e[stream=io.airbyte.protocol.models.AirbyteStream@3f2b7824[name=bounces,jsonSchema={"type":"object","$schema":"http://json-schema.org/draft-07/schema#","properties":{"email":{"type":"string"},"reason":{"type":"string"},"status":{"type":"string"},"created":{"type":"integer"}}},supportedSyncModes=[full_refresh, incremental],sourceDefinedCursor=true,defaultCursorField=[created],sourceDefinedPrimaryKey=[[email]],namespace=<null>,additionalProperties={}],syncMode=incremental,cursorField=[created],destinationSyncMode=append_dedup,primaryKey=[[email]],additionalProperties={}], io.airbyte.protocol.models.ConfiguredAirbyteStream@14fe278b[stream=io.airbyte.protocol.models.AirbyteStream@df41f57[name=invalid_emails,jsonSchema={"type":"object","$schema":"http://json-schema.org/draft-07/schema#","properties":{"email":{"type":"string"},"reason":{"type":"string"},"created":{"type":"integer"}}},supportedSyncModes=[full_refresh, incremental],sourceDefinedCursor=true,defaultCursorField=[created],sourceDefinedPrimaryKey=[[email]],namespace=<null>,additionalProperties={}],syncMode=incremental,cursorField=[created],destinationSyncMode=append_dedup,primaryKey=[[email]],additionalProperties={}], io.airbyte.protocol.models.ConfiguredAirbyteStream@21bcee83[stream=io.airbyte.protocol.models.AirbyteStream@453258c8[name=spam_reports,jsonSchema={"type":"object","$schema":"http://json-schema.org/draft-07/schema#","properties":{"ip":{"type":"string"},"email":{"type":"string"},"created":{"type":"integer"}}},supportedSyncModes=[full_refresh, incremental],sourceDefinedCursor=true,defaultCursorField=[created],sourceDefinedPrimaryKey=[[email]],namespace=<null>,additionalProperties={}],syncMode=incremental,cursorField=[created],destinationSyncMode=append_dedup,primaryKey=[[email]],additionalProperties={}], 
io.airbyte.protocol.models.ConfiguredAirbyteStream@2095ed9a[stream=io.airbyte.protocol.models.AirbyteStream@db64078[name=unsubscribe_groups,jsonSchema={"type":"object","$schema":"http://json-schema.org/draft-07/schema#","properties":{"id":{"type":["null","integer"]},"name":{"type":["null","string"]},"is_default":{"type":["null","boolean"]},"description":{"type":["null","string"]},"unsubscribes":{"type":["null","integer"]},"last_email_sent_at":{"type":["null","integer"]}},"additionalProperties":true},supportedSyncModes=[full_refresh],sourceDefinedCursor=<null>,defaultCursorField=[],sourceDefinedPrimaryKey=[[id]],namespace=<null>,additionalProperties={}],syncMode=full_refresh,cursorField=[],destinationSyncMode=overwrite,primaryKey=[[id]],additionalProperties={}]],additionalProperties={}],failures=[io.airbyte.config.FailureReason@4bb87fbb[failureOrigin=source,failureType=system_error,internalMessage=Error -3 while decompressing data: unknown compression method,externalMessage=Something went wrong in the connector. See the logs for more details.,metadata=io.airbyte.config.Metadata@1c9b8a1c[additionalProperties={attemptNumber=4, jobId=8629198, from_trace_message=true, connector_command=read}],stacktrace=Traceback (most recent call last):
  File "/airbyte/integration_code/main.py", line 8, in <module>
    run()
  File "/airbyte/integration_code/source_sendgrid/run.py", line 14, in run
    launch(source, sys.argv[1:])
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/entrypoint.py", line 209, in launch
    for message in source_entrypoint.run(parsed_args):
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/entrypoint.py", line 116, in run
    yield from map(AirbyteEntrypoint.airbyte_message_to_string, self.read(source_spec, config, config_catalog, state))
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/entrypoint.py", line 158, in read
    yield from self.source.read(self.logger, config, catalog, state)
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/abstract_source.py", line 142, in read
    raise e
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/abstract_source.py", line 116, in read
    stream_is_available, reason = stream_instance.check_availability(logger, self)
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/streams/core.py", line 211, in check_availability
    return self.availability_strategy.check_availability(self, logger, source)
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/streams/http/availability_strategy.py", line 50, in check_availability
    get_first_record_for_slice(stream, stream_slice)
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/streams/utils/stream_helper.py", line 40, in get_first_record_for_slice
    return next(records_for_slice)
  File "/airbyte/integration_code/source_sendgrid/streams.py", line 215, in read_records
    for record in self.read_with_chunks(*self.download_data(url=url)):
  File "/airbyte/integration_code/source_sendgrid/streams.py", line 302, in download_data
    data_file.write(decompressor.decompress(chunk))
zlib.error: Error -3 while decompressing data: unknown compression method
,retryable=<null>,timestamp=1708537086682,additionalProperties={}], io.airbyte.config.FailureReason@13ed4d28[failureOrigin=source,failureType=<null>,internalMessage=Source didn't exit properly - check the logs!,externalMessage=Something went wrong within the source connector,metadata=io.airbyte.config.Metadata@2ce76aad[additionalProperties={attemptNumber=4, jobId=8629198, connector_command=read}],stacktrace=io.airbyte.workers.internal.exception.SourceException: Source didn't exit properly - check the logs!
    at io.airbyte.workers.general.DefaultReplicationWorker.lambda$readFromSrcAndWriteToDstRunnable$8(DefaultReplicationWorker.java:367)
    at java.base/java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
    at java.base/java.lang.Thread.run(Thread.java:1583)
Caused by: io.airbyte.workers.exception.WorkerException: Source process exit with code 1. This warning is normal if the job was cancelled.
    at io.airbyte.workers.internal.DefaultAirbyteSource.close(DefaultAirbyteSource.java:158)
    at io.airbyte.workers.general.DefaultReplicationWorker.lambda$readFromSrcAndWriteToDstRunnable$8(DefaultReplicationWorker.java:365)
    ... 4 more
,retryable=<null>,timestamp=1708537087203,additionalProperties={}]],additionalProperties={}]
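
For reference, here is a rough sketch of the kind of guard that would avoid the hard failure in `download_data` (hypothetical code, not the actual `source_sendgrid` implementation; the helper name and signature are illustrative): peek at the first chunk and only run the download through `zlib` when it carries the gzip magic bytes, otherwise write the payload through unchanged.

```python
# Hypothetical defensive download helper -- assumes the contacts export is normally
# gzip-compressed but may come back empty or uncompressed, which is what trips
# "Error -3 while decompressing data".
import zlib

import requests

GZIP_MAGIC = b"\x1f\x8b"


def download_export(url: str, file_path: str, chunk_size: int = 1024) -> str:
    """Stream the export at `url` into `file_path`, decompressing only if it is gzipped."""
    with requests.get(url, stream=True, timeout=60) as response:
        response.raise_for_status()
        chunks = response.iter_content(chunk_size=chunk_size)
        first = next(chunks, b"")
        with open(file_path, "wb") as data_file:
            if first.startswith(GZIP_MAGIC):
                # 32 + MAX_WBITS lets zlib auto-detect gzip/zlib headers.
                decompressor = zlib.decompressobj(zlib.MAX_WBITS | 32)
                data_file.write(decompressor.decompress(first))
                for chunk in chunks:
                    data_file.write(decompressor.decompress(chunk))
                data_file.write(decompressor.flush())
            else:
                # Empty or uncompressed payload: pass it through instead of raising
                # zlib.error and failing the whole sync.
                data_file.write(first)
                for chunk in chunks:
                    data_file.write(chunk)
    return file_path
```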


marcosmarxm commented 3 months ago

Hello @blakels, the Sendgrid connector is a vital part of our community, but it's not currently on our roadmap for updates. This means it might take some time before it gets prioritized by the Airbyte team. However, we encourage community involvement to improve it, and your contributions are welcome! If you're interested, please reach out to me on Slack so we can discuss how you can help. Thanks for your support!