Connector Name
destination-databricks
Connector Version
1.1.0
What step the error happened?
Configuring a new connector
Relevant information
In resources/spec.json the parameters "s3_access_key_id" and "s3_secret_access_key" are listed as required, but they probably shouldn't be anymore. The connector uses the same "io.airbyte.cdk.integrations.destination.s3.S3DestinationConfig" object for its config, which supports role-based auth instead of an access key. The part of the Databricks connector that creates this config would likely need to be updated to behave like the "official" S3 connector.
The same applies to Starburst Galaxy. I don't use that destination, but making those parameters optional there should also be an easy fix.
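A rough sketch of what the spec.json change might look like. Only the two key names come from this issue; the other entries in the "required" array are placeholders standing in for whatever the real spec requires, not the connector's actual fields:

```diff
   "required": [
-    "s3_bucket_name",
-    "s3_access_key_id",
-    "s3_secret_access_key"
+    "s3_bucket_name"
   ],
```

With the keys no longer required, the config-creation code would also need to fall back to role-based auth when they are absent, as the S3 connector does.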
Relevant log output
No response
Contribute