airbytehq / airbyte

The leading data integration platform for ETL / ELT data pipelines from APIs, databases & files to data warehouses, data lakes & data lakehouses. Both self-hosted and Cloud-hosted.
https://airbyte.com

[source-BambooHR] No such file or directory: '/usr/local/lib/python3.9/site-packages/airbyte_cdk/schemas/custom_reports_stream.json' #39438

Open · sbishdatadude opened 3 months ago

sbishdatadude commented 3 months ago

Connector Name

source-BambooHR

Connector Version

0.3.1

What step the error happened?

During the sync

Relevant information

I'm connecting BambooHR to Snowflake using Airbyte Cloud. The sync started failing on June 3rd, 2024 with the error: [Errno 2] No such file or directory: '/usr/local/lib/python3.9/site-packages/airbyte_cdk/schemas/custom_reports_stream.json'

Relevant log output

2024-06-11 16:51:35 source > Starting syncing SourceBambooHr
2024-06-11 16:51:35 source > Marking stream custom_reports_stream as STARTED
2024-06-11 16:51:35 source > Setting state of SourceBambooHr stream to {}
2024-06-11 16:51:35 source > Syncing stream: custom_reports_stream 
2024-06-11 16:51:35 source > Encountered an exception while reading stream custom_reports_stream
Traceback (most recent call last):
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/abstract_source.py", line 135, in read
    yield from self._read_stream(
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/abstract_source.py", line 230, in _read_stream
    for record_data_or_message in record_iterator:
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/streams/core.py", line 171, in read
    for record_data_or_message in records:
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/declarative/declarative_stream.py", line 128, in read_records
    yield from self.retriever.read_records(self.get_json_schema(), stream_slice)
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/declarative/declarative_stream.py", line 137, in get_json_schema
    return self._schema_loader.get_json_schema()
  File "/airbyte/integration_code/source_bamboo_hr/components.py", line 33, in get_json_schema
    default_schema = self._get_json_schema_from_file()
  File "/airbyte/integration_code/source_bamboo_hr/components.py", line 55, in _get_json_schema_from_file
    return super().get_json_schema()
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/declarative/schema/json_file_schema_loader.py", line 58, in get_json_schema
    raw_json_file = pkgutil.get_data(resource, schema_path)
  File "/usr/local/lib/python3.9/pkgutil.py", line 639, in get_data
    return loader.get_data(resource_name)
  File "<frozen>", line 1039, in get_data
FileNotFoundError: [Errno 2] No such file or directory: '/usr/local/lib/python3.9/site-packages/airbyte_cdk/schemas/custom_reports_stream.json'
2024-06-11 16:51:35 source > Marking stream custom_reports_stream as STOPPED
2024-06-11 16:51:35 source > Finished syncing custom_reports_stream
2024-06-11 16:51:35 source > SourceBambooHr runtimes:
Syncing stream custom_reports_stream 0:00:00.010656
2024-06-11 16:51:35 source > During the sync, the following streams did not sync successfully: custom_reports_stream: AirbyteTracedException("[Errno 2] No such file or directory: '/usr/local/lib/python3.9/site-packages/airbyte_cdk/schemas/custom_reports_stream.json'")
2024-06-11 16:51:35 replication-orchestrator > Stream status TRACE received of status: STARTED for stream custom_reports_stream
2024-06-11 16:51:35 source > None
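The traceback ends in `pkgutil.get_data(resource, schema_path)`, which resolves the resource path relative to the directory of the package it is given. That explains why the lookup lands under `airbyte_cdk/schemas/` rather than the connector's own `schemas/` directory. A minimal stand-alone demonstration of that resolution behavior, using the stdlib `json` package purely as an example package:

```python
import os
import pkgutil
import json as json_pkg  # stdlib package, used only to demonstrate path resolution

# pkgutil.get_data(package, resource) resolves `resource` relative to the
# directory containing `package`. If the schema path is resolved against
# airbyte_cdk instead of the connector package, the lookup ends up in
# .../site-packages/airbyte_cdk/schemas/... and fails, as in the log above.
print(os.path.dirname(json_pkg.__file__))  # the directory files are resolved against

# A file that actually lives inside the package is found:
data = pkgutil.get_data("json", "decoder.py")
print(data is not None)

# A file that does not exist under that package raises FileNotFoundError,
# which is the same failure mode shown in the traceback:
try:
    pkgutil.get_data("json", "schemas/custom_reports_stream.json")
except FileNotFoundError as exc:
    print(type(exc).__name__)
```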


marcosmarxm commented 3 months ago

The migration to low-code appears to have broken the connector. @sbishdatadude, could you try to fix the problem? You can reach out on Slack for assistance.
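One possible direction for a fix, given that the schema path is being resolved against the CDK's install directory: load the schema file relative to the connector package explicitly. This is a hypothetical sketch, not the CDK's API — the helper name, and the assumption that the schema lives at `source_bamboo_hr/schemas/custom_reports_stream.json`, are both illustrative:

```python
import json
import os

# Hypothetical helper (not part of airbyte_cdk): read a stream's JSON schema
# from a known package directory instead of letting pkgutil resolve it
# against airbyte_cdk's own site-packages location.
def load_schema(package_dir: str, stream_name: str) -> dict:
    """Load schemas/<stream_name>.json from the given package directory."""
    path = os.path.join(package_dir, "schemas", f"{stream_name}.json")
    with open(path, encoding="utf-8") as fh:
        return json.load(fh)
```

In the connector's `components.py`, `package_dir` could be derived from `os.path.dirname(__file__)` so the lookup always stays inside the connector package regardless of where the CDK is installed.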