Hello, I've encountered two issues with the inventory streams while using source-amazon-seller-partner:
1. The stream GET_FBA_MYI_ALL_INVENTORY_DATA is missing from the source. Is there a plan to add this stream anytime soon? Please see the image below; a possible stopgap is sketched after this list.
2. The stream GET_FBA_MYI_UNSUPPRESSED_INVENTORY_DATA fails during data synchronization, so no data can be retrieved. The error log is in "Relevant log output" below.
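For issue 1, in case it helps anyone who needs this data before the stream lands: below is a minimal sketch of requesting the GET_FBA_MYI_ALL_INVENTORY_DATA report directly from the SP-API Reports API (2021-06-30). The access token and marketplace id are placeholders, and LWA token exchange is assumed to be handled elsewhere.

```python
import requests

# Placeholders (hypothetical): a valid LWA access token and the NA endpoint.
ACCESS_TOKEN = "<lwa-access-token>"
ENDPOINT = "https://sellingpartnerapi-na.amazon.com"

# Ask the Reports API (2021-06-30) to generate the report that the
# connector does not yet expose as a stream.
resp = requests.post(
    f"{ENDPOINT}/reports/2021-06-30/reports",
    headers={"x-amz-access-token": ACCESS_TOKEN},
    json={
        "reportType": "GET_FBA_MYI_ALL_INVENTORY_DATA",
        "marketplaceIds": ["ATVPDKIKX0DER"],  # US marketplace
    },
)
resp.raise_for_status()
report_id = resp.json()["reportId"]
# Poll GET /reports/2021-06-30/reports/{report_id} until it is DONE, then
# download the document referenced by its reportDocumentId.
print(report_id)
```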
Relevant log output
2024-01-08 06:22:33 platform > Checking if airbyte/normalization-clickhouse:0.4.3 exists...
2024-01-08 06:22:33 platform > airbyte/normalization-clickhouse:0.4.3 was found locally.
2024-01-08 06:22:33 platform > Creating docker container = normalization-clickhouse-normalize-176-0-ivfsw with resources io.airbyte.config.ResourceRequirements@54ae32b3[cpuRequest=,cpuLimit=,memoryRequest=,memoryLimit=,additionalProperties={}] and allowedHosts null
2024-01-08 06:22:33 platform > Preparing command: docker run --rm --init -i -w /data/176/0/normalize --log-driver none --name normalization-clickhouse-normalize-176-0-ivfsw --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e DEPLOYMENT_MODE=OSS -e WORKER_ENVIRONMENT=DOCKER -e AIRBYTE_ROLE= -e AIRBYTE_VERSION=0.50.35 airbyte/normalization-clickhouse:0.4.3 run --integration-type clickhouse --config destination_config.json --catalog destination_catalog.json
2024-01-08 06:22:34 normalization > Running: transform-config --config destination_config.json --integration-type clickhouse --out /data/176/0/normalize
2024-01-08 06:22:35 normalization > Namespace(config='destination_config.json', integration_type=<DestinationType.CLICKHOUSE: 'clickhouse'>, out='/data/176/0/normalize')
2024-01-08 06:22:35 normalization > transform_clickhouse
2024-01-08 06:22:35 normalization > Running: transform-catalog --integration-type clickhouse --profile-config-dir /data/176/0/normalize --catalog destination_catalog.json --out /data/176/0/normalize/models/generated/ --json-column _airbyte_data
2024-01-08 06:22:36 normalization > Processing destination_catalog.json...
2024-01-08 06:22:36 normalization > Truncating amz_spapi_snb_na_GET_FBA_MYI_UNSUPPRESSED_INVENTORY_DATA (#56) to amz_spapi_snb_na_GET_RESSED_INVENTORY_DATA (#43)
2024-01-08 06:22:36 normalization > Truncating amz_spapi_snb_na_GET_FBA_MYI_UNSUPPRESSED_INVENTORY_DATA (#56) to amz_spapi_snb_na_GET_RESSED_INVENTORY_DATA (#43)
2024-01-08 06:22:36 normalization > Generating airbyte_ctes/airbyte/amz_spapi_snb_na_GET__RESSED_INVENTORY_DATA_ab1.sql from amz_spapi_snb_na_GET_FBA_MYI_UNSUPPRESSED_INVENTORY_DATA
2024-01-08 06:22:36 normalization > Truncating amz_spapi_snb_na_GET_FBA_MYI_UNSUPPRESSED_INVENTORY_DATA (#56) to amz_spapi_snb_na_GET_RESSED_INVENTORY_DATA (#43)
2024-01-08 06:22:36 normalization > Generating airbyte_ctes/airbyte/amz_spapi_snb_na_GET__RESSED_INVENTORY_DATA_ab2.sql from amz_spapi_snb_na_GET_FBA_MYI_UNSUPPRESSED_INVENTORY_DATA
2024-01-08 06:22:36 normalization > Truncating amz_spapi_snb_na_GET_FBA_MYI_UNSUPPRESSED_INVENTORY_DATA (#56) to amz_spapi_snb_na_GET_RESSED_INVENTORY_DATA (#43)
2024-01-08 06:22:36 normalization > Truncating _airbyte_amz_spapi_snb_na_GET__RESSED_INVENTORY_DATA_hashid (#59) to _airbyte_amz_spapi_s_INVENTORY_DATA_hashid (#43)
2024-01-08 06:22:36 normalization > Truncating amz_spapi_snb_na_GET_FBA_MYI_UNSUPPRESSED_INVENTORY_DATA (#56) to amz_spapi_snb_na_GET_RESSED_INVENTORY_DATA (#43)
2024-01-08 06:22:36 normalization > Generating airbyte_ctes/airbyte/amz_spapi_snb_na_GET__RESSED_INVENTORY_DATA_ab3.sql from amz_spapi_snb_na_GET_FBA_MYI_UNSUPPRESSED_INVENTORY_DATA
2024-01-08 06:22:36 normalization > Truncating amz_spapi_snb_na_GET_FBA_MYI_UNSUPPRESSED_INVENTORY_DATA (#56) to amz_spapi_snb_na_GET_RESSED_INVENTORY_DATA (#43)
2024-01-08 06:22:36 normalization > Truncating _airbyte_amz_spapi_snb_na_GET__RESSED_INVENTORY_DATA_hashid (#59) to _airbyte_amz_spapi_s_INVENTORY_DATA_hashid (#43)
2024-01-08 06:22:36 normalization > Truncating amz_spapi_snb_na_GET_FBA_MYI_UNSUPPRESSED_INVENTORY_DATA (#56) to amz_spapi_snb_na_GET_RESSED_INVENTORY_DATA (#43)
2024-01-08 06:22:36 normalization > Adding drop table hook for amz_spapi_snb_na_GET__RESSED_INVENTORY_DATA_scd to amz_spapi_snb_na_GET__RESSED_INVENTORY_DATA
2024-01-08 06:22:36 normalization > Generating airbyte_tables/airbyte/amz_spapi_snb_na_GET__RESSED_INVENTORY_DATA.sql from amz_spapi_snb_na_GET_FBA_MYI_UNSUPPRESSED_INVENTORY_DATA
2024-01-08 06:22:36 normalization > detected no config file for ssh, assuming ssh is off.
2024-01-08 06:22:46 normalization > 06:22:46 Running with dbt=1.4.6
2024-01-08 06:22:46 normalization > 06:22:46 Unable to do partial parsing because saved manifest not found. Starting full parse.
2024-01-08 06:22:50 normalization > 06:22:50 [WARNING]: Configuration paths exist in your dbt_project.yml file which do not apply to any resources.
2024-01-08 06:22:50 normalization > There are 2 unused configuration paths:
2024-01-08 06:22:50 normalization > - models.airbyte_utils.generated.airbyte_incremental
2024-01-08 06:22:50 normalization > - models.airbyte_utils.generated.airbyte_views
2024-01-08 06:22:50 normalization > 06:22:50 Found 4 models, 0 tests, 0 snapshots, 0 analyses, 753 macros, 0 operations, 0 seed files, 1 source, 0 exposures, 0 metrics
2024-01-08 06:22:50 normalization > 06:22:50
2024-01-08 06:22:50 normalization > 06:22:50 Concurrency: 1 threads (target='prod')
2024-01-08 06:22:50 normalization > 06:22:50
2024-01-08 06:22:51 normalization > 06:22:51 1 of 1 START sql table model airbyte.amz_spapi_snb_na_GET__RESSED_INVENTORY_DATA ....................................... [RUN]
2024-01-08 06:22:51 normalization > 06:22:51 1 of 1 OK created sql table model airbyte.amz_spapi_snb_na_GET__RESSED_INVENTORY_DATA .................................. [OK in 0.53s]
2024-01-08 06:22:51 normalization > 06:22:51
2024-01-08 06:22:51 normalization > 06:22:51 Finished running 1 table model in 0 hours 0 minutes and 1.36 seconds (1.36s).
2024-01-08 06:22:51 normalization > 06:22:51
2024-01-08 06:22:51 normalization > 06:22:51 Completed successfully
2024-01-08 06:22:51 normalization > 06:22:51
2024-01-08 06:22:51 normalization > 06:22:51 Done. PASS=1 WARN=0 ERROR=0 SKIP=0 TOTAL=1
2024-01-08 06:22:52 normalization > Unable to connect ClickHouse Connect C to Numpy API [No module named 'numpy'], falling back to pure Python
2024-01-08 06:22:52 platform > Terminating normalization process...
2024-01-08 06:22:52 platform > Normalization process successfully terminated.
2024-01-08 06:22:52 platform > Normalization executed in 19 seconds for job 176.
2024-01-08 06:22:52 platform > Normalization summary: io.airbyte.config.NormalizationSummary@7c3bb32a[startTime=1704694953396,endTime=1704694972461,failures=[],additionalProperties={}]
2024-01-08 06:22:52 platform >
2024-01-08 06:22:52 platform > ----- END DEFAULT NORMALIZATION -----
2024-01-08 06:22:52 platform >
2024-01-08 06:22:52 platform > Retry State: RetryManager(completeFailureBackoffPolicy=BackoffPolicy(minInterval=PT10S, maxInterval=PT30M, base=3), partialFailureBackoffPolicy=null, successiveCompleteFailureLimit=5, totalCompleteFailureLimit=10, successivePartialFailureLimit=1000, totalPartialFailureLimit=10, successiveCompleteFailures=1, totalCompleteFailures=1, successivePartialFailures=0, totalPartialFailures=0)
Backoff before next attempt: 10 seconds
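For what it's worth, the repeated "Truncating ... (#56) to ... (#43)" lines above appear to be normalization shortening the stream name to fit an identifier-length limit by keeping the head and tail of the name. A rough sketch of that behavior (my approximation for illustration, not the actual normalization code):

```python
def truncate_identifier(name: str, max_length: int = 43) -> str:
    """Keep the head and tail of an over-long identifier around a '__' marker."""
    if len(name) <= max_length:
        return name
    budget = max_length - 2      # reserve two characters for the '__' marker
    head = budget // 2           # 20 characters from the front
    tail = budget - head         # 21 characters from the back
    return name[:head] + "__" + name[-tail:]

name = "amz_spapi_snb_na_GET_FBA_MYI_UNSUPPRESSED_INVENTORY_DATA"  # 56 chars
print(truncate_identifier(name))
# -> amz_spapi_snb_na_GET__RESSED_INVENTORY_DATA (43 chars, matching the
#    generated model names in the log)
```

The 43-character result matches the table name in the log, which is why the tables appear as amz_spapi_snb_na_GET__RESSED_INVENTORY_DATA rather than under the full stream name.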
Connector Name
source-amazon-seller-partner
Connector Version
3.0.1
What step the error happened?
During the sync