airbytehq / airbyte

The leading data integration platform for ETL / ELT data pipelines from APIs, databases & files to data warehouses, data lakes & data lakehouses. Both self-hosted and Cloud-hosted.
https://airbyte.com

[source-freshdesk] Large sync partial success #38687

Open sluo2024 opened 5 months ago

sluo2024 commented 5 months ago

Connector Name

source-freshdesk

Connector Version

Freshdesk v3.1.0 CERTIFIED

What step the error happened?

None

Relevant information

Hello, I am loading Freshdesk as the source into Snowflake as the destination. The first load was taking a long time (around 80 hours), so I canceled the sync. It then reported that only ~1,200 records were loaded, even though far more were extracted. When I checked the destination, more than 1.7 million records had actually been loaded, so the job status seems wrong about the loaded record count.
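For reference, here is a minimal way I could double-check what actually landed in Snowflake. The table names are taken from the sync log below; the upcased final table name is an assumption:

```sql
-- Minimal sanity check; table names assumed from the sync log below.
-- Raw (pre-typing) rows written by the destination:
SELECT COUNT(*) FROM "airbyte_internal"."FRESHDESK_raw__stream_tickets";

-- Rows in the final, typed table (the upcased name is an assumption):
SELECT COUNT(*) FROM "FRESHDESK"."TICKETS";
```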

[Screenshot attached: 2024-05-23 at 3:37:07 PM]

I ran a second sync and selected the Incremental | Deduped mode for the tickets table, expecting more records to be loaded this time. After 48 hours, the sync does not appear to have loaded any more data into the tickets table, even though Airbyte extracted a lot of records. I had thought incremental mode would pull only new or changed data rather than refreshing the whole table. Can someone help explain what could cause the load to not look incremental, and how I can load the remaining data for the tickets table?
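A hedged way to tell "extracted" apart from "loaded": under Destinations V2, raw-table rows that have not yet been typed/deduped into the final table should carry a NULL `_airbyte_loaded_at`. The column names below follow the V2 raw-table format and should be treated as an assumption for this connector version:

```sql
-- Sketch: count raw rows still awaiting typing/dedup into the final table.
-- COUNT_IF is Snowflake-specific; the _airbyte_* columns follow the
-- Destinations V2 raw-table format (an assumption here).
SELECT
  COUNT(*)                               AS extracted_rows,
  COUNT_IF("_airbyte_loaded_at" IS NULL) AS awaiting_typing_dedup,
  MAX("_airbyte_extracted_at")           AS last_extracted_at
FROM "airbyte_internal"."FRESHDESK_raw__stream_tickets";
```

If `awaiting_typing_dedup` is large, the records were extracted and staged but the typing/dedup step never committed them to the final table, which would match a canceled or still-running sync.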

Thanks a lot!

Relevant log output

2024-05-22 23:07:46 platform > Retry State: RetryManager(completeFailureBackoffPolicy=BackoffPolicy(minInterval=PT10S, maxInterval=PT30M, base=3), partialFailureBackoffPolicy=null, successiveCompleteFailureLimit=5, totalCompleteFailureLimit=10, successivePartialFailureLimit=1000, totalPartialFailureLimit=10, successiveCompleteFailures=0, totalCompleteFailures=0, successivePartialFailures=2, totalPartialFailures=2)
2024-05-22 23:07:46 platform > Backing off for: 0 seconds.
2024-05-22 23:07:47 platform > Docker volume job log path: /tmp/workspace/39/2/logs.log
2024-05-22 23:07:47 platform > Executing worker wrapper. Airbyte version: 0.59.0
2024-05-22 23:07:47 platform > Attempt 0 to save workflow id for cancellation
2024-05-22 23:07:47 platform > Using default value for environment variable SIDECAR_KUBE_CPU_LIMIT: '2.0'
2024-05-22 23:07:47 platform > Using default value for environment variable SOCAT_KUBE_CPU_LIMIT: '2.0'
2024-05-22 23:07:47 platform > Using default value for environment variable SIDECAR_KUBE_CPU_REQUEST: '0.1'
2024-05-22 23:07:47 platform > Using default value for environment variable SOCAT_KUBE_CPU_REQUEST: '0.1'
2024-05-22 23:07:47 platform > 
2024-05-22 23:07:47 platform > ----- START CHECK -----
2024-05-22 23:07:47 platform > 
2024-05-22 23:07:47 platform > Checking if airbyte/source-freshdesk:3.1.0 exists...
2024-05-22 23:07:47 platform > airbyte/source-freshdesk:3.1.0 was found locally.
2024-05-22 23:07:47 platform > Creating docker container = source-freshdesk-check-39-2-nsofz with resources io.airbyte.config.ResourceRequirements@50a514d7[cpuRequest=,cpuLimit=,memoryRequest=,memoryLimit=,additionalProperties={}] and allowedHosts io.airbyte.config.AllowedHosts@246940aa[hosts=[*.freshdesk.com, *.datadoghq.com, *.datadoghq.eu, *.sentry.io],additionalProperties={}]
2024-05-22 23:07:47 platform > Preparing command: docker run --rm --init -i -w /data/39/2 --log-driver none --name source-freshdesk-check-39-2-nsofz --network host -v airbyte_workspace:/data -v oss_local_root:/local -e DEPLOYMENT_MODE=OSS -e WORKER_CONNECTOR_IMAGE=airbyte/source-freshdesk:3.1.0 -e AUTO_DETECT_SCHEMA=true -e LAUNCHDARKLY_KEY= -e SOCAT_KUBE_CPU_REQUEST=0.1 -e SOCAT_KUBE_CPU_LIMIT=2.0 -e FIELD_SELECTION_WORKSPACES= -e USE_STREAM_CAPABLE_STATE=true -e WORKER_ENVIRONMENT=DOCKER -e AIRBYTE_ROLE=dev -e APPLY_FIELD_SELECTION=false -e WORKER_JOB_ATTEMPT=2 -e OTEL_COLLECTOR_ENDPOINT=http://host.docker.internal:4317 -e FEATURE_FLAG_CLIENT=config -e AIRBYTE_VERSION=0.59.0 -e WORKER_JOB_ID=39 airbyte/source-freshdesk:3.1.0 check --config source_config.json
2024-05-22 23:07:47 platform > Reading messages from protocol version 0.2.0
2024-05-22 23:07:50 platform > Check succeeded
2024-05-22 23:07:50 platform > Check connection job received output: io.airbyte.config.StandardCheckConnectionOutput@430872a3[status=succeeded,message=<null>,additionalProperties={}]
2024-05-22 23:07:50 platform > 
2024-05-22 23:07:50 platform > ----- END CHECK -----
2024-05-22 23:07:50 platform > 
2024-05-22 23:07:51 platform > Docker volume job log path: /tmp/workspace/39/2/logs.log
2024-05-22 23:07:51 platform > Executing worker wrapper. Airbyte version: 0.59.0
2024-05-22 23:07:51 platform > Attempt 0 to save workflow id for cancellation
2024-05-22 23:07:51 platform > Using default value for environment variable SIDECAR_KUBE_CPU_LIMIT: '2.0'
2024-05-22 23:07:51 platform > 
2024-05-22 23:07:51 platform > Using default value for environment variable SOCAT_KUBE_CPU_LIMIT: '2.0'
2024-05-22 23:07:51 platform > ----- START CHECK -----
2024-05-22 23:07:51 platform > Using default value for environment variable SIDECAR_KUBE_CPU_REQUEST: '0.1'
2024-05-22 23:07:51 platform > 
2024-05-22 23:07:51 platform > Using default value for environment variable SOCAT_KUBE_CPU_REQUEST: '0.1'
2024-05-22 23:07:51 platform > Checking if airbyte/destination-snowflake:3.7.2 exists...
2024-05-22 23:07:51 platform > airbyte/destination-snowflake:3.7.2 was found locally.
2024-05-22 23:07:51 platform > Creating docker container = destination-snowflake-check-39-2-kcquh with resources io.airbyte.config.ResourceRequirements@6aee2b66[cpuRequest=,cpuLimit=,memoryRequest=,memoryLimit=,additionalProperties={}] and allowedHosts null
2024-05-22 23:07:51 platform > Preparing command: docker run --rm --init -i -w /data/39/2 --log-driver none --name destination-snowflake-check-39-2-kcquh --network host -v airbyte_workspace:/data -v oss_local_root:/local -e DEPLOYMENT_MODE=OSS -e WORKER_CONNECTOR_IMAGE=airbyte/destination-snowflake:3.7.2 -e AUTO_DETECT_SCHEMA=true -e LAUNCHDARKLY_KEY= -e SOCAT_KUBE_CPU_REQUEST=0.1 -e SOCAT_KUBE_CPU_LIMIT=2.0 -e FIELD_SELECTION_WORKSPACES= -e USE_STREAM_CAPABLE_STATE=true -e WORKER_ENVIRONMENT=DOCKER -e AIRBYTE_ROLE=dev -e APPLY_FIELD_SELECTION=false -e WORKER_JOB_ATTEMPT=2 -e OTEL_COLLECTOR_ENDPOINT=http://host.docker.internal:4317 -e FEATURE_FLAG_CLIENT=config -e AIRBYTE_VERSION=0.59.0 -e WORKER_JOB_ID=39 airbyte/destination-snowflake:3.7.2 check --config source_config.json
2024-05-22 23:07:51 platform > Reading messages from protocol version 0.2.0
2024-05-22 23:07:52 platform > INFO main i.a.c.i.b.a.AdaptiveDestinationRunner$Runner(getDestination):55 Running destination under deployment mode: OSS
2024-05-22 23:07:52 platform > INFO main i.a.c.i.b.a.AdaptiveDestinationRunner$Runner(run):68 Starting destination: io.airbyte.integrations.destination.snowflake.SnowflakeDestination
2024-05-22 23:07:52 platform > INFO main i.a.c.i.b.IntegrationCliParser$Companion(parseOptions):145 integration args: {check=null, config=source_config.json}
2024-05-22 23:07:52 platform > INFO main i.a.c.i.b.IntegrationRunner(runInternal):124 Running integration: io.airbyte.integrations.destination.snowflake.SnowflakeDestination
2024-05-22 23:07:52 platform > INFO main i.a.c.i.b.IntegrationRunner(runInternal):125 Command: CHECK
2024-05-22 23:07:52 platform > INFO main i.a.c.i.b.IntegrationRunner(runInternal):126 Integration config: IntegrationConfig{command=CHECK, configPath='source_config.json', catalogPath='null', statePath='null'}
2024-05-22 23:07:53 platform > WARN main c.n.s.JsonMetaSchema(newValidator):278 Unknown keyword pattern_descriptor - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2024-05-22 23:07:53 platform > WARN main c.n.s.JsonMetaSchema(newValidator):278 Unknown keyword order - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2024-05-22 23:07:53 platform > WARN main c.n.s.JsonMetaSchema(newValidator):278 Unknown keyword airbyte_secret - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2024-05-22 23:07:53 platform > WARN main c.n.s.JsonMetaSchema(newValidator):278 Unknown keyword airbyte_hidden - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2024-05-22 23:07:53 platform > INFO main i.a.c.i.d.j.c.SwitchingDestination(check):53 Using destination type: INTERNAL_STAGING
2024-05-22 23:07:53 platform > INFO main c.z.h.HikariDataSource(getConnection):109 HikariPool-1 - Starting...
2024-05-22 23:07:54 platform > INFO main c.z.h.p.HikariPool(checkFailFast):554 HikariPool-1 - Added connection net.snowflake.client.jdbc.SnowflakeConnectionV1@767f6ee7
2024-05-22 23:07:54 platform > INFO main c.z.h.HikariDataSource(getConnection):122 HikariPool-1 - Start completed.
2024-05-22 23:07:55 platform > INFO main i.a.c.d.j.DefaultJdbcDatabase(unsafeQuery$lambda$6):128 closing connection
2024-05-22 23:07:55 platform > INFO main i.a.c.i.d.r.BaseSerializedBuffer(flush):156 Finished writing data to ada22f52-eb9e-4a8a-aa51-3a72675897001570737424292278396.csv.gz (106 bytes)
2024-05-22 23:07:56 platform > INFO main i.a.c.d.j.DefaultJdbcDatabase(unsafeQuery$lambda$6):128 closing connection
2024-05-22 23:07:56 platform > INFO main i.a.i.d.s.SnowflakeInternalStagingSqlOperations(uploadRecordsToStage):108 Successfully loaded records to stage "FRESHDESK"."_airbyte_connection_test_a64b2ec47da54edc9227d84ff447e48d"/ with 0 re-attempt(s)
2024-05-22 23:07:56 platform > INFO main c.z.h.HikariDataSource(close):349 HikariPool-1 - Shutdown initiated...
2024-05-22 23:07:56 platform > INFO main c.z.h.HikariDataSource(close):351 HikariPool-1 - Shutdown completed.
2024-05-22 23:07:56 platform > INFO main i.a.c.i.b.IntegrationRunner(runInternal):251 Completed integration: io.airbyte.integrations.destination.snowflake.SnowflakeDestination
2024-05-22 23:07:56 platform > INFO main i.a.c.i.b.a.AdaptiveDestinationRunner$Runner(run):70 Completed destination: io.airbyte.integrations.destination.snowflake.SnowflakeDestination
2024-05-22 23:07:58 platform > Check connection job received output: io.airbyte.config.StandardCheckConnectionOutput@70bde6fc[status=succeeded,message=<null>,additionalProperties={}]
2024-05-22 23:07:58 platform > 
2024-05-22 23:07:58 platform > ----- END CHECK -----
2024-05-22 23:07:58 platform > 
2024-05-22 23:08:02 platform > Docker volume job log path: /tmp/workspace/39/2/logs.log
2024-05-22 23:08:02 platform > Executing worker wrapper. Airbyte version: 0.59.0
2024-05-22 23:08:02 platform > Attempt 0 to save workflow id for cancellation
2024-05-22 23:08:02 platform > start sync worker. job id: 39 attempt id: 2
2024-05-22 23:08:02 platform > 
2024-05-22 23:08:02 platform > ----- START REPLICATION -----
2024-05-22 23:08:02 platform > 
2024-05-22 23:08:02 platform > Running destination...
2024-05-22 23:08:02 platform > Using default value for environment variable SIDECAR_KUBE_CPU_LIMIT: '2.0'
2024-05-22 23:08:02 platform > Using default value for environment variable SOCAT_KUBE_CPU_LIMIT: '2.0'
2024-05-22 23:08:02 platform > Using default value for environment variable SIDECAR_KUBE_CPU_REQUEST: '0.1'
2024-05-22 23:08:02 platform > Using default value for environment variable SIDECAR_KUBE_CPU_LIMIT: '2.0'
2024-05-22 23:08:02 platform > Using default value for environment variable SOCAT_KUBE_CPU_REQUEST: '0.1'
2024-05-22 23:08:02 platform > Using default value for environment variable SOCAT_KUBE_CPU_LIMIT: '2.0'
2024-05-22 23:08:02 platform > Using default value for environment variable SIDECAR_KUBE_CPU_REQUEST: '0.1'
2024-05-22 23:08:02 platform > Using default value for environment variable SOCAT_KUBE_CPU_REQUEST: '0.1'
2024-05-22 23:08:02 platform > Checking if airbyte/source-freshdesk:3.1.0 exists...
2024-05-22 23:08:02 platform > Checking if airbyte/destination-snowflake:3.7.2 exists...
2024-05-22 23:08:02 platform > airbyte/source-freshdesk:3.1.0 was found locally.
2024-05-22 23:08:02 platform > airbyte/destination-snowflake:3.7.2 was found locally.
2024-05-22 23:08:02 platform > Creating docker container = source-freshdesk-read-39-2-yrncb with resources io.airbyte.config.ResourceRequirements@19c4429a[cpuRequest=0.2,cpuLimit=1,memoryRequest=1Gi,memoryLimit=2Gi,additionalProperties={}] and allowedHosts io.airbyte.config.AllowedHosts@4166dbe5[hosts=[*.freshdesk.com, *.datadoghq.com, *.datadoghq.eu, *.sentry.io],additionalProperties={}]
2024-05-22 23:08:02 platform > Preparing command: docker run --rm --init -i -w /data/39/2 --log-driver none --name source-freshdesk-read-39-2-yrncb -e CONCURRENT_SOURCE_STREAM_READ=false --network host -v airbyte_workspace:/data -v oss_local_root:/local -e DEPLOYMENT_MODE=OSS -e WORKER_CONNECTOR_IMAGE=airbyte/source-freshdesk:3.1.0 -e AUTO_DETECT_SCHEMA=true -e LAUNCHDARKLY_KEY= -e SOCAT_KUBE_CPU_REQUEST=0.1 -e SOCAT_KUBE_CPU_LIMIT=2.0 -e FIELD_SELECTION_WORKSPACES= -e USE_STREAM_CAPABLE_STATE=true -e WORKER_ENVIRONMENT=DOCKER -e AIRBYTE_ROLE=dev -e APPLY_FIELD_SELECTION=false -e WORKER_JOB_ATTEMPT=2 -e OTEL_COLLECTOR_ENDPOINT=http://host.docker.internal:4317 -e FEATURE_FLAG_CLIENT=config -e AIRBYTE_VERSION=0.59.0 -e WORKER_JOB_ID=39 --cpus=1 --memory-reservation=1Gi --memory=2Gi airbyte/source-freshdesk:3.1.0 read --config source_config.json --catalog source_catalog.json
2024-05-22 23:08:02 platform > Creating docker container = destination-snowflake-write-39-2-qepwn with resources io.airbyte.config.ResourceRequirements@76a08b1d[cpuRequest=0.2,cpuLimit=1,memoryRequest=1Gi,memoryLimit=2Gi,additionalProperties={}] and allowedHosts null
2024-05-22 23:08:02 platform > Preparing command: docker run --rm --init -i -w /data/39/2 --log-driver none --name destination-snowflake-write-39-2-qepwn --network host -v airbyte_workspace:/data -v oss_local_root:/local -e DEPLOYMENT_MODE=OSS -e WORKER_CONNECTOR_IMAGE=airbyte/destination-snowflake:3.7.2 -e AUTO_DETECT_SCHEMA=true -e LAUNCHDARKLY_KEY= -e SOCAT_KUBE_CPU_REQUEST=0.1 -e SOCAT_KUBE_CPU_LIMIT=2.0 -e FIELD_SELECTION_WORKSPACES= -e USE_STREAM_CAPABLE_STATE=true -e WORKER_ENVIRONMENT=DOCKER -e AIRBYTE_ROLE=dev -e APPLY_FIELD_SELECTION=false -e WORKER_JOB_ATTEMPT=2 -e OTEL_COLLECTOR_ENDPOINT=http://host.docker.internal:4317 -e FEATURE_FLAG_CLIENT=config -e AIRBYTE_VERSION=0.59.0 -e WORKER_JOB_ID=39 --cpus=1 --memory-reservation=1Gi --memory=2Gi airbyte/destination-snowflake:3.7.2 write --config destination_config.json --catalog destination_catalog.json
2024-05-22 23:08:02 platform > Reading messages from protocol version 0.2.0
2024-05-22 23:08:02 platform > Writing messages to protocol version 0.2.0
2024-05-22 23:08:02 platform > Reading messages from protocol version 0.2.0
2024-05-22 23:08:02 platform > readFromSource: start
2024-05-22 23:08:02 platform > writeToDestination: start
2024-05-22 23:08:02 platform > processMessage: start
2024-05-22 23:08:02 platform > Starting source heartbeat check. Will check threshold of 10800 seconds, every 1 minutes.
2024-05-22 23:08:02 platform > readFromDestination: start
2024-05-22 23:08:05 destination > INFO main i.a.c.i.b.a.AdaptiveDestinationRunner$Runner(getDestination):55 Running destination under deployment mode: OSS
2024-05-22 23:08:05 destination > INFO main i.a.c.i.b.a.AdaptiveDestinationRunner$Runner(run):68 Starting destination: io.airbyte.integrations.destination.snowflake.SnowflakeDestination
2024-05-22 23:08:06 destination > INFO main i.a.c.i.b.IntegrationCliParser$Companion(parseOptions):145 integration args: {catalog=destination_catalog.json, write=null, config=destination_config.json}
2024-05-22 23:08:06 destination > INFO main i.a.c.i.b.IntegrationRunner(runInternal):124 Running integration: io.airbyte.integrations.destination.snowflake.SnowflakeDestination
2024-05-22 23:08:06 destination > INFO main i.a.c.i.b.IntegrationRunner(runInternal):125 Command: WRITE
2024-05-22 23:08:06 source > Starting syncing SourceFreshdesk
2024-05-22 23:08:06 destination > INFO main i.a.c.i.b.IntegrationRunner(runInternal):126 Integration config: IntegrationConfig{command=WRITE, configPath='destination_config.json', catalogPath='destination_catalog.json', statePath='null'}
2024-05-22 23:08:06 source > Marking stream agents as STARTED
2024-05-22 23:08:06 source > Syncing stream: agents 
2024-05-22 23:08:06 destination > WARN main c.n.s.JsonMetaSchema(newValidator):278 Unknown keyword pattern_descriptor - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2024-05-22 23:08:06 destination > WARN main c.n.s.JsonMetaSchema(newValidator):278 Unknown keyword order - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2024-05-22 23:08:06 destination > WARN main c.n.s.JsonMetaSchema(newValidator):278 Unknown keyword airbyte_secret - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2024-05-22 23:08:06 destination > WARN main c.n.s.JsonMetaSchema(newValidator):278 Unknown keyword airbyte_hidden - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2024-05-22 23:08:07 source > Marking stream agents as RUNNING
2024-05-22 23:08:07 destination > INFO main i.a.c.i.d.s.StagingConsumerFactory$Companion(toWriteConfig$lambda$1):316 Write config: WriteConfig{streamName=agents, namespace=FRESHDESK, outputSchemaName=airbyte_internal, tmpTableName=_airbyte_tmp_pfs_agents, outputTableName=FRESHDESK_raw__stream_agents, syncMode=overwrite}
2024-05-22 23:08:07 destination > INFO main i.a.c.i.d.s.StagingConsumerFactory$Companion(toWriteConfig$lambda$1):316 Write config: WriteConfig{streamName=sla_policies, namespace=FRESHDESK, outputSchemaName=airbyte_internal, tmpTableName=_airbyte_tmp_ufm_sla_policies, outputTableName=FRESHDESK_raw__stream_sla_policies, syncMode=append}
2024-05-22 23:08:07 destination > INFO main i.a.c.i.d.s.StagingConsumerFactory$Companion(toWriteConfig$lambda$1):316 Write config: WriteConfig{streamName=products, namespace=FRESHDESK, outputSchemaName=airbyte_internal, tmpTableName=_airbyte_tmp_mwe_products, outputTableName=FRESHDESK_raw__stream_products, syncMode=append}
2024-05-22 23:08:07 destination > INFO main i.a.c.i.d.s.StagingConsumerFactory$Companion(toWriteConfig$lambda$1):316 Write config: WriteConfig{streamName=roles, namespace=FRESHDESK, outputSchemaName=airbyte_internal, tmpTableName=_airbyte_tmp_sqm_roles, outputTableName=FRESHDESK_raw__stream_roles, syncMode=append}
2024-05-22 23:08:07 destination > INFO main i.a.c.i.d.s.StagingConsumerFactory$Companion(toWriteConfig$lambda$1):316 Write config: WriteConfig{streamName=tickets, namespace=FRESHDESK, outputSchemaName=airbyte_internal, tmpTableName=_airbyte_tmp_gdb_tickets, outputTableName=FRESHDESK_raw__stream_tickets, syncMode=append_dedup}
2024-05-22 23:08:07 destination > INFO main i.a.c.i.d.s.StagingConsumerFactory$Companion(toWriteConfig$lambda$1):316 Write config: WriteConfig{streamName=companies, namespace=FRESHDESK, outputSchemaName=airbyte_internal, tmpTableName=_airbyte_tmp_tcv_companies, outputTableName=FRESHDESK_raw__stream_companies, syncMode=append}
2024-05-22 23:08:07 destination > INFO main i.a.c.i.d.s.StagingConsumerFactory$Companion(toWriteConfig$lambda$1):316 Write config: WriteConfig{streamName=time_entries, namespace=FRESHDESK, outputSchemaName=airbyte_internal, tmpTableName=_airbyte_tmp_dlw_time_entries, outputTableName=FRESHDESK_raw__stream_time_entries, syncMode=append}
2024-05-22 23:08:07 destination > INFO main i.a.c.i.d.a.b.BufferManager(<init>):43 Max 'memory' available for buffer allocation 742 MB
2024-05-22 23:08:07 destination > INFO main i.a.c.i.b.IntegrationRunner$Companion(consumeWriteStream$io_airbyte_airbyte_cdk_java_airbyte_cdk_airbyte_cdk_core):407 Starting buffered read of input stream
2024-05-22 23:08:07 destination > INFO main i.a.c.i.d.a.FlushWorkers(start):74 Start async buffer supervisor
2024-05-22 23:08:07 destination > INFO pool-4-thread-1 i.a.c.i.d.a.b.BufferManager(printQueueInfo):89 [ASYNC QUEUE INFO] Global: max: 742.41 MB, allocated: 10 MB (10.0 MB), %% used: 0.013469714189502041 | State Manager memory usage: Allocated: 10 MB, Used: 0 bytes, percentage Used 0.0
2024-05-22 23:08:07 destination > INFO main i.a.c.i.d.a.AsyncStreamConsumer(start):112 class io.airbyte.cdk.integrations.destination.async.AsyncStreamConsumer started.
2024-05-22 23:08:07 destination > INFO main i.a.c.i.d.s.GeneralStagingFunctions(onStartFunction$lambda$0):41 Preparing raw tables in destination started for 7 streams
2024-05-22 23:08:07 destination > INFO pool-7-thread-1 i.a.c.i.d.a.FlushWorkers(printWorkerInfo):128 [ASYNC WORKER INFO] Pool queue size: 0, Active threads: 0
2024-05-22 23:08:07 destination > INFO main i.a.i.d.s.t.SnowflakeDestinationHandler(execute):213 Executing sql 699a3e75-8397-49df-90ed-d36b7701da63-bf71d8ba-4c9f-49f3-a2ef-00fa8f52e0fe: CREATE SCHEMA IF NOT EXISTS "airbyte_internal";
2024-05-22 23:08:07 destination > INFO main c.z.h.HikariDataSource(getConnection):109 HikariPool-1 - Starting...
2024-05-22 23:08:08 source > Read 196 records from agents stream
2024-05-22 23:08:08 platform > Source state message checksum is valid for stream _agents.
2024-05-22 23:08:08 source > Marking stream agents as STOPPED
2024-05-22 23:08:08 source > Finished syncing agents
2024-05-22 23:08:08 source > SourceFreshdesk runtimes:
Syncing stream agents 0:00:01.971785
2024-05-22 23:08:08 source > Marking stream sla_policies as STARTED
2024-05-22 23:08:08 source > Syncing stream: sla_policies 
2024-05-22 23:08:08 source > Marking stream sla_policies as RUNNING
2024-05-22 23:08:08 source > Read 3 records from sla_policies stream
2024-05-22 23:08:08 source > Marking stream sla_policies as STOPPED
2024-05-22 23:08:08 source > Finished syncing sla_policies
2024-05-22 23:08:08 source > SourceFreshdesk runtimes:
Syncing stream agents 0:00:01.971785
Syncing stream sla_policies 0:00:00.315783
2024-05-22 23:08:08 source > Marking stream products as STARTED
2024-05-22 23:08:08 platform > Source state message checksum is valid for stream _sla_policies.
2024-05-22 23:08:08 source > Syncing stream: products 
2024-05-22 23:08:09 source > Marking stream products as RUNNING
2024-05-22 23:08:09 platform > Source state message checksum is valid for stream _products.
2024-05-22 23:08:09 source > Read 33 records from products stream
2024-05-22 23:08:09 source > Marking stream products as STOPPED
2024-05-22 23:08:09 source > Finished syncing products
2024-05-22 23:08:09 source > SourceFreshdesk runtimes:
Syncing stream agents 0:00:01.971785
Syncing stream products 0:00:00.430630
Syncing stream sla_policies 0:00:00.315783
2024-05-22 23:08:09 source > Marking stream roles as STARTED
2024-05-22 23:08:09 source > Syncing stream: roles 
2024-05-22 23:08:09 source > Marking stream roles as RUNNING
2024-05-22 23:08:09 source > Read 12 records from roles stream
2024-05-22 23:08:09 source > Marking stream roles as STOPPED
2024-05-22 23:08:09 source > Finished syncing roles
2024-05-22 23:08:09 source > SourceFreshdesk runtimes:
Syncing stream agents 0:00:01.971785
Syncing stream products 0:00:00.430630
Syncing stream roles 0:00:00.263661
Syncing stream sla_policies 0:00:00.315783
2024-05-22 23:08:09 source > Marking stream tickets as STARTED
2024-05-22 23:08:09 source > Syncing stream: tickets 
2024-05-22 23:08:09 platform > Source state message checksum is valid for stream _roles.
2024-05-22 23:08:10 source > Marking stream tickets as RUNNING
2024-05-22 23:08:10 destination > INFO main c.z.h.p.HikariPool(checkFailFast):554 HikariPool-1 - Added connection net.snowflake.client.jdbc.SnowflakeConnectionV1@697173d9
2024-05-22 23:08:10 destination > INFO main c.z.h.HikariDataSource(getConnection):122 HikariPool-1 - Start completed.
2024-05-22 23:08:11 destination > INFO main i.a.i.d.s.t.SnowflakeDestinationHandler(execute):232 Sql 699a3e75-8397-49df-90ed-d36b7701da63-bf71d8ba-4c9f-49f3-a2ef-00fa8f52e0fe completed in 3469 ms
2024-05-22 23:08:11 destination > INFO main i.a.i.d.s.t.SnowflakeDestinationHandler(execute):213 Executing sql 699a3e75-8397-49df-90ed-d36b7701da63-8a2b8469-9c9d-4f34-9066-4d09532f9f00: CREATE SCHEMA IF NOT EXISTS "FRESHDESK";
2024-05-22 23:08:11 destination > INFO main i.a.i.d.s.t.SnowflakeDestinationHandler(execute):232 Sql 699a3e75-8397-49df-90ed-d36b7701da63-8a2b8469-9c9d-4f34-9066-4d09532f9f00 completed in 264 ms
2024-05-22 23:08:11 destination > INFO type-and-dedupe i.a.i.b.d.t.BaseDestinationV1V2Migrator(migrateIfNecessary):19 Assessing whether migration is necessary for stream AGENTS
2024-05-22 23:08:11 destination > INFO type-and-dedupe i.a.i.b.d.t.BaseDestinationV1V2Migrator(shouldMigrate):44 Checking whether v1 raw table _airbyte_raw_agents in dataset FRESHDESK exists
2024-05-22 23:08:11 destination > INFO type-and-dedupe i.a.i.b.d.t.BaseDestinationV1V2Migrator(migrateIfNecessary):19 Assessing whether migration is necessary for stream SLA_POLICIES
2024-05-22 23:08:11 destination > INFO type-and-dedupe i.a.i.b.d.t.BaseDestinationV1V2Migrator(shouldMigrate):44 Checking whether v1 raw table _airbyte_raw_sla_policies in dataset FRESHDESK exists
2024-05-22 23:08:11 destination > INFO type-and-dedupe i.a.i.b.d.t.BaseDestinationV1V2Migrator(migrateIfNecessary):19 Assessing whether migration is necessary for stream PRODUCTS
2024-05-22 23:08:11 destination > INFO type-and-dedupe i.a.i.b.d.t.BaseDestinationV1V2Migrator(shouldMigrate):44 Checking whether v1 raw table _airbyte_raw_products in dataset FRESHDESK exists
2024-05-22 23:08:11 destination > INFO type-and-dedupe i.a.i.b.d.t.BaseDestinationV1V2Migrator(migrateIfNecessary):19 Assessing whether migration is necessary for stream ROLES
2024-05-22 23:08:11 destination > INFO type-and-dedupe i.a.i.b.d.t.BaseDestinationV1V2Migrator(shouldMigrate):44 Checking whether v1 raw table _airbyte_raw_roles in dataset FRESHDESK exists
2024-05-22 23:08:11 destination > INFO type-and-dedupe i.a.i.b.d.t.BaseDestinationV1V2Migrator(migrateIfNecessary):19 Assessing whether migration is necessary for stream TICKETS
2024-05-22 23:08:11 destination > INFO type-and-dedupe i.a.i.b.d.t.BaseDestinationV1V2Migrator(shouldMigrate):44 Checking whether v1 raw table _airbyte_raw_tickets in dataset FRESHDESK exists
2024-05-22 23:08:11 destination > INFO type-and-dedupe i.a.i.b.d.t.BaseDestinationV1V2Migrator(migrateIfNecessary):19 Assessing whether migration is necessary for stream COMPANIES
2024-05-22 23:08:11 destination > INFO type-and-dedupe i.a.i.b.d.t.BaseDestinationV1V2Migrator(shouldMigrate):44 Checking whether v1 raw table _airbyte_raw_companies in dataset FRESHDESK exists
2024-05-22 23:08:11 destination > INFO type-and-dedupe i.a.i.b.d.t.BaseDestinationV1V2Migrator(migrateIfNecessary):19 Assessing whether migration is necessary for stream TIME_ENTRIES
2024-05-22 23:08:11 destination > INFO type-and-dedupe i.a.i.b.d.t.BaseDestinationV1V2Migrator(shouldMigrate):44 Checking whether v1 raw table _airbyte_raw_time_entries in dataset FRESHDESK exists
2024-05-22 23:08:12 destination > INFO type-and-dedupe i.a.c.d.j.DefaultJdbcDatabase(unsafeQuery$lambda$6):128 closing connection
2024-05-22 23:08:12 destination > INFO type-and-dedupe i.a.c.d.j.DefaultJdbcDatabase(unsafeQuery$lambda$6):128 closing connection
2024-05-22 23:08:12 destination > INFO type-and-dedupe i.a.c.d.j.DefaultJdbcDatabase(unsafeQuery$lambda$6):128 closing connection
2024-05-22 23:08:12 destination > INFO type-and-dedupe i.a.c.d.j.DefaultJdbcDatabase(unsafeQuery$lambda$6):128 closing connection
2024-05-22 23:08:12 destination > INFO type-and-dedupe i.a.c.d.j.DefaultJdbcDatabase(unsafeQuery$lambda$6):128 closing connection
2024-05-22 23:08:12 destination > INFO type-and-dedupe i.a.c.d.j.DefaultJdbcDatabase(unsafeQuery$lambda$6):128 closing connection
2024-05-22 23:08:12 destination > INFO type-and-dedupe i.a.c.d.j.DefaultJdbcDatabase(unsafeQuery$lambda$6):128 closing connection
2024-05-22 23:08:13 destination > INFO type-and-dedupe i.a.c.d.j.DefaultJdbcDatabase(unsafeQuery$lambda$6):128 closing connection
2024-05-22 23:08:13 destination > INFO type-and-dedupe i.a.c.d.j.DefaultJdbcDatabase(unsafeQuery$lambda$6):128 closing connection
2024-05-22 23:08:13 destination > INFO type-and-dedupe i.a.c.d.j.DefaultJdbcDatabase(unsafeQuery$lambda$6):128 closing connection
2024-05-22 23:08:13 destination > INFO type-and-dedupe i.a.c.d.j.DefaultJdbcDatabase(unsafeQuery$lambda$6):128 closing connection
2024-05-22 23:08:13 destination > INFO type-and-dedupe i.a.c.d.j.DefaultJdbcDatabase(unsafeQuery$lambda$6):128 closing connection
2024-05-22 23:08:13 destination > INFO type-and-dedupe i.a.c.d.j.DefaultJdbcDatabase(unsafeQuery$lambda$6):128 closing connection
2024-05-22 23:08:13 destination > INFO type-and-dedupe i.a.c.d.j.DefaultJdbcDatabase(unsafeQuery$lambda$6):128 closing connection
2024-05-22 23:08:14 destination > INFO type-and-dedupe i.a.c.d.j.DefaultJdbcDatabase(unsafeQuery$lambda$6):128 closing connection
2024-05-22 23:08:14 destination > INFO type-and-dedupe i.a.i.b.d.t.BaseDestinationV1V2Migrator(shouldMigrate):54 Migration Info: Required for Sync mode: true, No existing v2 raw tables: false, A v1 raw table exists: false
2024-05-22 23:08:14 destination > INFO type-and-dedupe i.a.i.b.d.t.BaseDestinationV1V2Migrator(migrateIfNecessary):31 No Migration Required for stream: ROLES
2024-05-22 23:08:14 destination > INFO type-and-dedupe i.a.c.d.j.DefaultJdbcDatabase(unsafeQuery$lambda$6):128 closing connection
2024-05-22 23:08:14 destination > INFO type-and-dedupe i.a.i.b.d.t.BaseDestinationV1V2Migrator(shouldMigrate):54 Migration Info: Required for Sync mode: true, No existing v2 raw tables: false, A v1 raw table exists: false
2024-05-22 23:08:14 destination > INFO type-and-dedupe i.a.i.b.d.t.BaseDestinationV1V2Migrator(migrateIfNecessary):31 No Migration Required for stream: PRODUCTS
2024-05-22 23:08:14 destination > INFO type-and-dedupe i.a.c.d.j.DefaultJdbcDatabase(unsafeQuery$lambda$6):128 closing connection
2024-05-22 23:08:14 destination > INFO type-and-dedupe i.a.i.b.d.t.BaseDestinationV1V2Migrator(shouldMigrate):54 Migration Info: Required for Sync mode: false, No existing v2 raw tables: false, A v1 raw table exists: false
2024-05-22 23:08:14 destination > INFO type-and-dedupe i.a.i.b.d.t.BaseDestinationV1V2Migrator(migrateIfNecessary):31 No Migration Required for stream: AGENTS
2024-05-22 23:08:14 destination > INFO type-and-dedupe i.a.c.d.j.DefaultJdbcDatabase(unsafeQuery$lambda$6):128 closing connection
2024-05-22 23:08:14 destination > INFO type-and-dedupe i.a.i.b.d.t.BaseDestinationV1V2Migrator(shouldMigrate):54 Migration Info: Required for Sync mode: true, No existing v2 raw tables: false, A v1 raw table exists: false
2024-05-22 23:08:14 destination > INFO type-and-dedupe i.a.i.b.d.t.BaseDestinationV1V2Migrator(migrateIfNecessary):31 No Migration Required for stream: SLA_POLICIES
2024-05-22 23:08:14 destination > INFO type-and-dedupe i.a.c.d.j.DefaultJdbcDatabase(unsafeQuery$lambda$6):128 closing connection
2024-05-22 23:08:14 destination > INFO type-and-dedupe i.a.i.b.d.t.BaseDestinationV1V2Migrator(shouldMigrate):54 Migration Info: Required for Sync mode: true, No existing v2 raw tables: false, A v1 raw table exists: false
2024-05-22 23:08:14 destination > INFO type-and-dedupe i.a.i.b.d.t.BaseDestinationV1V2Migrator(migrateIfNecessary):31 No Migration Required for stream: TICKETS
2024-05-22 23:08:14 destination > INFO type-and-dedupe i.a.c.d.j.DefaultJdbcDatabase(unsafeQuery$lambda$6):128 closing connection
2024-05-22 23:08:14 destination > INFO type-and-dedupe i.a.i.b.d.t.BaseDestinationV1V2Migrator(shouldMigrate):54 Migration Info: Required for Sync mode: true, No existing v2 raw tables: false, A v1 raw table exists: false
2024-05-22 23:08:14 destination > INFO type-and-dedupe i.a.i.b.d.t.BaseDestinationV1V2Migrator(migrateIfNecessary):31 No Migration Required for stream: COMPANIES
2024-05-22 23:08:14 destination > INFO type-and-dedupe i.a.c.d.j.DefaultJdbcDatabase(unsafeQuery$lambda$6):128 closing connection
2024-05-22 23:08:14 destination > INFO type-and-dedupe i.a.i.b.d.t.BaseDestinationV1V2Migrator(shouldMigrate):54 Migration Info: Required for Sync mode: true, No existing v2 raw tables: false, A v1 raw table exists: false
2024-05-22 23:08:14 destination > INFO type-and-dedupe i.a.i.b.d.t.BaseDestinationV1V2Migrator(migrateIfNecessary):31 No Migration Required for stream: TIME_ENTRIES
2024-05-22 23:08:15 destination > INFO type-and-dedupe i.a.c.d.j.DefaultJdbcDatabase(unsafeQuery$lambda$6):128 closing connection
2024-05-22 23:08:15 destination > INFO type-and-dedupe i.a.c.d.j.DefaultJdbcDatabase(unsafeQuery$lambda$6):128 closing connection
2024-05-22 23:08:15 destination > INFO type-and-dedupe i.a.c.d.j.DefaultJdbcDatabase(unsafeQuery$lambda$6):128 closing connection
2024-05-22 23:08:15 destination > INFO type-and-dedupe i.a.c.d.j.DefaultJdbcDatabase(unsafeQuery$lambda$6):128 closing connection
2024-05-22 23:08:15 destination > INFO type-and-dedupe i.a.c.d.j.DefaultJdbcDatabase(unsafeQuery$lambda$6):128 closing connection
2024-05-22 23:08:15 destination > INFO type-and-dedupe i.a.c.d.j.DefaultJdbcDatabase(unsafeQuery$lambda$6):128 closing connection
2024-05-22 23:08:15 destination > INFO type-and-dedupe i.a.c.d.j.DefaultJdbcDatabase(unsafeQuery$lambda$6):128 closing connection
2024-05-22 23:08:16 destination > INFO type-and-dedupe i.a.c.d.j.DefaultJdbcDatabase(unsafeQuery$lambda$6):128 closing connection
2024-05-22 23:08:16 destination > INFO type-and-dedupe i.a.c.d.j.DefaultJdbcDatabase(unsafeQuery$lambda$6):128 closing connection
2024-05-22 23:08:16 destination > INFO type-and-dedupe i.a.i.d.s.t.SnowflakeV2TableMigrator(migrateIfNecessary):55 Checking whether upcasing migration is necessary for FRESHDESK.products. Sync mode requires migration: true; existing case-sensitive table exists: false; existing uppercased table does not exist: false
2024-05-22 23:08:16 destination > INFO type-and-dedupe i.a.i.d.s.t.SnowflakeV2TableMigrator(migrateIfNecessary):55 Checking whether upcasing migration is necessary for FRESHDESK.roles. Sync mode requires migration: true; existing case-sensitive table exists: false; existing uppercased table does not exist: false
2024-05-22 23:08:16 destination > INFO type-and-dedupe i.a.c.d.j.DefaultJdbcDatabase(unsafeQuery$lambda$6):128 closing connection
2024-05-22 23:08:16 destination > INFO type-and-dedupe i.a.i.d.s.t.SnowflakeV2TableMigrator(migrateIfNecessary):55 Checking whether upcasing migration is necessary for FRESHDESK.tickets. Sync mode requires migration: true; existing case-sensitive table exists: false; existing uppercased table does not exist: false
2024-05-22 23:08:16 destination > INFO type-and-dedupe i.a.c.d.j.DefaultJdbcDatabase(unsafeQuery$lambda$6):128 closing connection
2024-05-22 23:08:16 destination > INFO type-and-dedupe i.a.i.d.s.t.SnowflakeV2TableMigrator(migrateIfNecessary):55 Checking whether upcasing migration is necessary for FRESHDESK.agents. Sync mode requires migration: false; existing case-sensitive table exists: false; existing uppercased table does not exist: false
2024-05-22 23:08:16 destination > INFO type-and-dedupe i.a.c.d.j.DefaultJdbcDatabase(unsafeQuery$lambda$6):128 closing connection
2024-05-22 23:08:16 destination > INFO type-and-dedupe i.a.i.d.s.t.SnowflakeV2TableMigrator(migrateIfNecessary):55 Checking whether upcasing migration is necessary for FRESHDESK.sla_policies. Sync mode requires migration: true; existing case-sensitive table exists: false; existing uppercased table does not exist: false
2024-05-22 23:08:16 destination > INFO type-and-dedupe i.a.c.d.j.DefaultJdbcDatabase(unsafeQuery$lambda$6):128 closing connection
2024-05-22 23:08:16 destination > INFO type-and-dedupe i.a.i.d.s.t.SnowflakeV2TableMigrator(migrateIfNecessary):55 Checking whether upcasing migration is necessary for FRESHDESK.companies. Sync mode requires migration: true; existing case-sensitive table exists: false; existing uppercased table does not exist: false
2024-05-22 23:08:16 destination > INFO type-and-dedupe i.a.c.d.j.DefaultJdbcDatabase(unsafeQuery$lambda$6):128 closing connection
2024-05-22 23:08:16 destination > INFO type-and-dedupe i.a.i.d.s.t.SnowflakeV2TableMigrator(migrateIfNecessary):55 Checking whether upcasing migration is necessary for FRESHDESK.time_entries. Sync mode requires migration: true; existing case-sensitive table exists: false; existing uppercased table does not exist: false
2024-05-22 23:08:17 destination > INFO main o.j.t.JooqLogger(lambda$info$5):355 

[jOOQ ASCII-art logo omitted]  Thank you for using jOOQ 3.16.23 (Build date: 2023-12-20T14:13:48Z)

2024-05-22 23:08:17 destination > INFO main o.j.t.JooqLogger(lambda$info$5):355 

jOOQ tip of the day: You don't *have to* map SQL results to POJOs if you're consuming XML or JSON in the end. Generate the XML or JSON directly in SQL, instead! https://blog.jooq.org/stop-mapping-stuff-in-your-middleware-use-sqls-xml-or-json-operators-instead/

2024-05-22 23:08:18 destination > INFO main i.a.c.d.j.DefaultJdbcDatabase(unsafeQuery$lambda$6):128 closing connection
2024-05-22 23:08:20 destination > INFO main i.a.c.d.j.DefaultJdbcDatabase(unsafeQuery$lambda$6):128 closing connection
2024-05-22 23:08:21 destination > INFO main i.a.c.d.j.DefaultJdbcDatabase(unsafeQuery$lambda$6):128 closing connection
2024-05-22 23:08:21 destination > INFO main i.a.i.d.s.t.SnowflakeDestinationHandler(lambda$getInitialRawTableState$7):136 Retrieving table from Db metadata: airbyte_internal FRESHDESK_raw__stream_sla_policies
2024-05-22 23:08:21 destination > INFO main i.a.i.d.s.t.SnowflakeDestinationHandler(lambda$getInitialRawTableState$7):136 Retrieving table from Db metadata: airbyte_internal FRESHDESK_raw__stream_products
2024-05-22 23:08:21 destination > INFO main i.a.i.d.s.t.SnowflakeDestinationHandler(lambda$getInitialRawTableState$7):136 Retrieving table from Db metadata: airbyte_internal FRESHDESK_raw__stream_roles
2024-05-22 23:08:22 destination > INFO main i.a.i.d.s.t.SnowflakeDestinationHandler(lambda$getInitialRawTableState$7):136 Retrieving table from Db metadata: airbyte_internal FRESHDESK_raw__stream_tickets
2024-05-22 23:08:22 destination > INFO main i.a.i.d.s.t.SnowflakeDestinationHandler(lambda$getInitialRawTableState$7):136 Retrieving table from Db metadata: airbyte_internal FRESHDESK_raw__stream_companies
2024-05-22 23:08:22 destination > INFO main i.a.i.d.s.t.SnowflakeDestinationHandler(lambda$getInitialRawTableState$7):136 Retrieving table from Db metadata: airbyte_internal FRESHDESK_raw__stream_time_entries
2024-05-22 23:08:23 destination > INFO main i.a.i.d.s.t.SnowflakeDestinationHandler(executeWithinTransaction):387 executing SQL:BEGIN
  BEGIN TRANSACTION;
      IF (EXISTS (SELECT 1 FROM  "airbyte_internal"."_airbyte_destination_state" where (false or ("name" = 'agents' and "namespace" = 'FRESHDESK') or ("name" = 'sla_policies' and "namespace" = 'FRESHDESK') or ("name" = 'products' and "namespace" = 'FRESHDESK') or ("name" = 'roles' and "namespace" = 'FRESHDESK') or ("name" = 'tickets' and "namespace" = 'FRESHDESK') or ("name" = 'companies' and "namespace" = 'FRESHDESK') or ("name" = 'time_entries' and "namespace" = 'FRESHDESK')))) THEN
    delete from "airbyte_internal"."_airbyte_destination_state" where (false or ("name" = 'agents' and "namespace" = 'FRESHDESK') or ("name" = 'sla_policies' and "namespace" = 'FRESHDESK') or ("name" = 'products' and "namespace" = 'FRESHDESK') or ("name" = 'roles' and "namespace" = 'FRESHDESK') or ("name" = 'tickets' and "namespace" = 'FRESHDESK') or ("name" = 'companies' and "namespace" = 'FRESHDESK') or ("name" = 'time_entries' and "namespace" = 'FRESHDESK'));
  END IF
;
    insert into "airbyte_internal"."_airbyte_destination_state" ("name", "namespace", "destination_state", "updated_at") values ('agents', 'FRESHDESK', '{"needsSoftReset":false}', '2024-05-22T23:08:23.124803842Z'), ('sla_policies', 'FRESHDESK', '{"needsSoftReset":false}', '2024-05-22T23:08:23.139245837Z'), ('products', 'FRESHDESK', '{"needsSoftReset":false}', '2024-05-22T23:08:23.139401649Z'), ('roles', 'FRESHDESK', '{"needsSoftReset":false}', '2024-05-22T23:08:23.139544001Z'), ('tickets', 'FRESHDESK', '{"needsSoftReset":false}', '2024-05-22T23:08:23.139664083Z'), ('companies', 'FRESHDESK', '{"needsSoftReset":false}', '2024-05-22T23:08:23.139781374Z'), ('time_entries', 'FRESHDESK', '{"needsSoftReset":false}', '2024-05-22T23:08:23.139937287Z');
  COMMIT;
END;
2024-05-22 23:08:24 destination > INFO main i.a.c.i.d.s.GeneralStagingFunctions(onStartFunction$lambda$0):62 Preparing staging area in destination started for schema airbyte_internal stream agents: target table: FRESHDESK_raw__stream_agents, stage: 2024/05/22/23/E67DEC10-F3EB-4154-BE5A-49E58D1A71FC/
2024-05-22 23:08:24 destination > INFO main i.a.c.d.j.DefaultJdbcDatabase(unsafeQuery$lambda$6):128 closing connection
2024-05-22 23:08:24 destination > INFO main i.a.c.i.d.s.GeneralStagingFunctions(onStartFunction$lambda$0):82 Preparing staging area in destination completed for schema airbyte_internal stream agents
2024-05-22 23:08:24 destination > INFO main i.a.c.i.d.s.GeneralStagingFunctions(onStartFunction$lambda$0):62 Preparing staging area in destination started for schema airbyte_internal stream sla_policies: target table: FRESHDESK_raw__stream_sla_policies, stage: 2024/05/22/23/E67DEC10-F3EB-4154-BE5A-49E58D1A71FC/
2024-05-22 23:08:24 destination > INFO main i.a.c.d.j.DefaultJdbcDatabase(unsafeQuery$lambda$6):128 closing connection
2024-05-22 23:08:25 destination > INFO main i.a.c.i.d.s.GeneralStagingFunctions(onStartFunction$lambda$0):82 Preparing staging area in destination completed for schema airbyte_internal stream sla_policies
2024-05-22 23:08:25 destination > INFO main i.a.c.i.d.s.GeneralStagingFunctions(onStartFunction$lambda$0):62 Preparing staging area in destination started for schema airbyte_internal stream products: target table: FRESHDESK_raw__stream_products, stage: 2024/05/22/23/E67DEC10-F3EB-4154-BE5A-49E58D1A71FC/


sluo2024 commented 5 months ago

Adding the log as an attachment: default_workspace_job_39_attempt_3_txt.txt

marcosmarxm commented 4 months ago

@sluo2024 you need to let the sync finish or fail on its own; then you can confirm the records were committed properly.
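Once a sync does complete on its own, a possible follow-up check is to compare the newest cursor value in the destination against Freshdesk itself. The tickets stream cursor is assumed to be `updated_at` here (verify against the connection's cursor field in the UI), and the upcased column/table names are assumptions for the Snowflake final table:

```sql
-- Sketch: newest synced ticket in the destination, for comparison with
-- the most recently updated ticket in Freshdesk. "UPDATED_AT" and the
-- upcased table name are assumptions.
SELECT MAX("UPDATED_AT") AS newest_synced_ticket
FROM "FRESHDESK"."TICKETS";
```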