Closed: martinibach closed this issue 2 years ago.
@martin-restack
We just merged this bugfix into master and released a new version of the connector.
Upgrade your connector to version 0.1.5 and you should be all set. To upgrade, open the admin panel on the left-hand side of the UI, find this connector in the list, and enter the latest connector version.
Please let us know if you have any further questions.
Enjoy!
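For anyone scripting this instead of clicking through the UI: the same upgrade can be done against the Airbyte server API. A minimal sketch, assuming the `POST /api/v1/source_definitions/update` endpoint and its `sourceDefinitionId`/`dockerImageTag` fields (verify against your instance's API docs before relying on them):

```python
import json

# Hedged sketch: bump a source connector's version via the Airbyte server API
# instead of the admin UI. Endpoint path and field names are assumptions based
# on the public Airbyte configuration API of this era.

AIRBYTE_URL = "http://localhost:8000"  # assumption: default local deployment


def build_upgrade_payload(source_definition_id: str, docker_image_tag: str) -> dict:
    """Request body for POST /api/v1/source_definitions/update."""
    return {
        "sourceDefinitionId": source_definition_id,  # find it via /api/v1/source_definitions/list
        "dockerImageTag": docker_image_tag,          # e.g. "0.1.5"
    }


if __name__ == "__main__":
    # The definition id below is a placeholder, not a real id.
    payload = build_upgrade_payload("<pipedrive-definition-id>", "0.1.5")
    print(json.dumps(payload))
```

Sending it with any HTTP client (`curl`, `requests`) against `{AIRBYTE_URL}/api/v1/source_definitions/update` should have the same effect as entering the version in the admin panel.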
@gaart thank you :-) Unfortunately it still doesn't seem to work. The logs also show that I upgraded to 0.1.5 and still encounter the issue.
@gaart did you have time to look at this again? :)
@gaart reopening the issue because it is not solved yet. I ran it on our integration account and normalization works there, but the columns @martin-restack is having problems with probably contain no data in our integration account.
Hi @martin-restack Could you try to run it again with the latest version, 0.1.6, pulled from the Docker Hub registry? In the logs, the schema looks different from what we expect to see there.
@gaart I have now deployed Airbyte 0.30.20-alpha on Kubernetes on DigitalOcean, and it's working properly with connector version 0.1.6. logs-35-0 (1).txt
Environment
Current Behavior
The persons table is not synced to BigQuery, and normalization fails with an error because of an invalid timestamp.
Expected Behavior
Normalization should run and the data should be synced to the table in BigQuery.
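For context on what "invalid timestamp" usually means here: BigQuery only accepts TIMESTAMP values it can parse and that fall inside its documented range (0001-01-01 to 9999-12-31), so a sentinel value like `0000-00-00 00:00:00` in the source data will break normalization. An illustrative sketch (not Airbyte's actual normalization code) of such a pre-load check:

```python
from datetime import datetime

# Illustrative sketch, not Airbyte's normalization logic: flag timestamp
# strings that BigQuery's TIMESTAMP type would reject, either because they do
# not parse or because they fall outside BigQuery's documented range
# (0001-01-01 .. 9999-12-31).


def is_bigquery_safe_timestamp(value: str) -> bool:
    """Return True if `value` parses as a timestamp BigQuery would accept."""
    try:
        # fromisoformat handles "YYYY-MM-DD HH:MM:SS[+HH:MM]" style strings;
        # map a trailing "Z" to an explicit UTC offset for older Pythons.
        parsed = datetime.fromisoformat(value.replace("Z", "+00:00"))
    except ValueError:
        return False
    naive = parsed.replace(tzinfo=None)
    return datetime(1, 1, 1) <= naive <= datetime(9999, 12, 31, 23, 59, 59)


print(is_bigquery_safe_timestamp("2021-09-23 19:06:42"))  # a well-formed value
print(is_bigquery_safe_timestamp("0000-00-00 00:00:00"))  # a classic bad sentinel
```

Running a check like this over the offending column in the raw table is one way to confirm which rows carry the bad values before blaming the connector.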
Logs
```
2021-09-23 19:06:42 INFO () WorkerRun(call):62 - Executing worker wrapper. Airbyte version: 0.29.21-alpha
2021-09-23 19:06:43 INFO () TemporalAttemptExecution(get):114 - Executing worker wrapper. Airbyte version: 0.29.21-alpha
2021-09-23 19:06:43 WARN () Databases(createPostgresDatabaseWithRetry):58 - Waiting for database to become available...
2021-09-23 19:06:43 INFO () JobsDatabaseInstance(lambda$static$2):45 - Testing if jobs database is ready...
2021-09-23 19:06:44 INFO () Databases(createPostgresDatabaseWithRetry):75 - Database available!
2021-09-23 19:06:44 WARN () JsonMetaSchema(newValidator):338 - Unknown keyword example - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2021-09-23 19:06:44 INFO () DefaultReplicationWorker(run):102 - start sync worker. job id: 18 attempt id: 0
2021-09-23 19:06:44 INFO () DefaultReplicationWorker(run):111 - configured sync modes: {null.persons=full_refresh - append}
2021-09-23 19:06:44 INFO () DefaultAirbyteDestination(start):78 - Running destination...
2021-09-23 19:06:44 INFO () LineGobbler(voidCall):85 - Checking if airbyte/destination-bigquery:0.4.0 exists...
2021-09-23 19:06:44 INFO () LineGobbler(voidCall):85 - airbyte/destination-bigquery:0.4.0 was found locally.
2021-09-23 19:06:44 INFO () DockerProcessFactory(create):146 - Preparing command: docker run --rm --init -i -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -w /data/18/0 --network host --log-driver none airbyte/destination-bigquery:0.4.0 write --config destination_config.json --catalog destination_catalog.json
2021-09-23 19:06:44 INFO () LineGobbler(voidCall):85 - Checking if airbyte/source-pipedrive:0.1.3 exists...
2021-09-23 19:06:44 INFO () LineGobbler(voidCall):85 - airbyte/source-pipedrive:0.1.3 was found locally.
2021-09-23 19:06:44 INFO () DockerProcessFactory(create):146 - Preparing command: docker run --rm --init -i -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -w /data/18/0 --network host --log-driver none airbyte/source-pipedrive:0.1.3 read --config source_config.json --catalog source_catalog.json
2021-09-23 19:06:44 INFO () DefaultReplicationWorker(lambda$getDestinationOutputRunnable$3):246 - Destination output thread started.
2021-09-23 19:06:44 INFO () DefaultReplicationWorker(run):139 - Waiting for source thread to join.
2021-09-23 19:06:44 INFO () DefaultReplicationWorker(lambda$getReplicationRunnable$2):210 - Replication thread started.
2021-09-23 19:06:45 INFO () DefaultAirbyteStreamFactory(internalLog):110 - Starting syncing SourcePipedrive
2021-09-23 19:06:45 INFO () DefaultAirbyteStreamFactory(internalLog):110 - Syncing stream: persons
2021-09-23 19:06:47 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-09-23 19:06:47 INFO i.a.i.d.b.BigQueryDestination(main):356 - {} - starting destination: class io.airbyte.integrations.destination.bigquery.BigQueryDestination
2021-09-23 19:06:47 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-09-23 19:06:47 INFO i.a.i.b.IntegrationRunner(run):96 - {} - Running integration: io.airbyte.integrations.destination.bigquery.BigQueryDestination
2021-09-23 19:06:47 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-09-23 19:06:47 INFO i.a.i.b.IntegrationCliParser(parseOptions):135 - {} - integration args: {catalog=destination_catalog.json, write=null, config=destination_config.json}
2021-09-23 19:06:47 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-09-23 19:06:47 INFO i.a.i.b.IntegrationRunner(run):100 - {} - Command: WRITE
2021-09-23 19:06:47 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-09-23 19:06:47 INFO i.a.i.b.IntegrationRunner(run):101 - {} - Integration config: IntegrationConfig{command=WRITE, configPath='destination_config.json', catalogPath='destination_catalog.json', statePath='null'}
2021-09-23 19:06:47 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-09-23 19:06:47 WARN c.n.s.JsonMetaSchema(newValidator):338 - {} - Unknown keyword examples - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2021-09-23 19:06:47 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-09-23 19:06:47 WARN c.n.s.JsonMetaSchema(newValidator):338 - {} - Unknown keyword airbyte_secret - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2021-09-23 19:06:47 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-09-23 19:06:47 INFO i.a.i.d.b.BigQueryDestination(getLoadingMethod):331 - {} - Selected loading method is set to: STANDARD
2021-09-23 19:06:47 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-09-23 19:06:47 INFO i.a.i.d.b.BigQueryDestination(isKeepFilesInGcs):344 - {} - All tmp files will be removed from GCS when migration is finished
2021-09-23 19:06:49 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-09-23 19:06:49 INFO i.a.i.d.b.BigQueryUtils(createTable):112 - {} - Table: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=pipedrive, tableId=_airbyte_tmp_cpj_pipedrive_persons}} created successfully
2021-09-23 19:06:51 INFO () DefaultAirbyteStreamFactory(internalLog):110 - Read 761 records from persons stream
2021-09-23 19:06:51 INFO () DefaultAirbyteStreamFactory(internalLog):110 - Finished syncing SourcePipedrive
2021-09-23 19:06:51 INFO () DefaultReplicationWorker(run):141 - Source thread complete.
2021-09-23 19:06:51 INFO () DefaultReplicationWorker(run):142 - Waiting for destination thread to join.
2021-09-23 19:06:52 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-09-23 19:06:52 INFO i.a.i.b.FailureTrackingAirbyteMessageConsumer(close):80 - {} - Airbyte message consumer: succeeded.
2021-09-23 19:06:52 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-09-23 19:06:52 INFO i.a.i.d.b.BigQueryRecordConsumer(close):163 - {} - Started closing all connections
2021-09-23 19:06:53 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-09-23 19:06:53 INFO i.a.i.d.b.BigQueryRecordConsumer(closeNormalBigqueryStreams):278 - {} - Waiting for jobs to be finished/closed
2021-09-23 19:06:55 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-09-23 19:06:55 INFO i.a.i.d.b.BigQueryRecordConsumer(closeNormalBigqueryStreams):295 - {} - Migration finished with no explicit errors. Copying data from tmp tables to permanent
2021-09-23 19:06:56 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-09-23 19:06:56 INFO i.a.i.d.b.BigQueryRecordConsumer(copyTable):359 - {} - successfully copied tmp table: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=pipedrive, tableId=_airbyte_tmp_cpj_pipedrive_persons}} to final table: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=pipedrive, tableId=_airbyte_raw_pipedrive_persons}}
2021-09-23 19:06:56 INFO () JsonSchemaValidator(test):76 - JSON schema validation failed. errors: $: null found, object expected
2021-09-23 19:06:56 ERROR () DefaultAirbyteStreamFactory(lambda$create$1):83 - Validation failed: null
2021-09-23 19:06:56 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-09-23 19:06:56 INFO i.a.i.d.b.BigQueryRecordConsumer(closeNormalBigqueryStreams):307 - {} - Removing tmp tables...
2021-09-23 19:06:56 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-09-23 19:06:56 INFO i.a.i.d.b.BigQueryRecordConsumer(closeNormalBigqueryStreams):309 - {} - Finishing destination process...completed
2021-09-23 19:06:56 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-09-23 19:06:56 INFO i.a.i.b.IntegrationRunner(run):153 - {} - Completed integration: io.airbyte.integrations.destination.bigquery.BigQueryDestination
2021-09-23 19:06:56 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-09-23 19:06:56 INFO i.a.i.d.b.BigQueryDestination(main):358 - {} - completed destination: class io.airbyte.integrations.destination.bigquery.BigQueryDestination
2021-09-23 19:06:56 INFO () DefaultReplicationWorker(run):144 - Destination thread complete.
2021-09-23 19:06:56 INFO () DefaultReplicationWorker(run):172 - sync summary: io.airbyte.config.ReplicationAttemptSummary@297fad54[status=completed,recordsSynced=761,bytesSynced=1300241,startTime=1632424004303,endTime=1632424016786]
2021-09-23 19:06:56 INFO () DefaultReplicationWorker(run):181 - Source did not output any state messages
2021-09-23 19:06:56 WARN () DefaultReplicationWorker(run):192 - State capture: No state retained.
2021-09-23 19:06:56 INFO () TemporalAttemptExecution(get):135 - Stopping cancellation check scheduling...
2021-09-23 19:06:56 INFO () SyncWorkflow$ReplicationActivityImpl(replicate):184 - sync summary: io.airbyte.config.StandardSyncOutput@6242b963[standardSyncSummary=io.airbyte.config.StandardSyncSummary@1f6fd0eb[status=completed,recordsSynced=761,bytesSynced=1300241,startTime=1632424004303,endTime=1632424016786],state=
```
Steps to Reproduce
Are you willing to submit a PR?
No