airbytehq / airbyte

The leading data integration platform for ETL / ELT data pipelines from APIs, databases & files to data warehouses, data lakes & data lakehouses. Both self-hosted and Cloud-hosted.
https://airbyte.com

[destination-databricks] uses S3, but still lists access keys as required even though they aren't (IRSA) #32684

Closed: johnsmclay closed this issue 1 month ago

johnsmclay commented 11 months ago

Connector Name

destination-databricks

Connector Version

1.1.0

What step the error happened?

Configuring a new connector

Relevant information

In resources/spec.json, the parameters "s3_access_key_id" and "s3_secret_access_key" are listed as required, but they probably shouldn't be anymore. The connector uses the same "io.airbyte.cdk.integrations.destination.s3.S3DestinationConfig" object for its configuration, and that object supports role-based auth instead of an access key. The part of the Databricks connector that builds this config would likely need to be updated to match the "official" S3 destination connector.

The same is true for Starburst Galaxy. I don't use that destination, but making those parameters optional there should also be an easy fix. A sketch of the spec change is below.
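For illustration only, a minimal sketch of what the resources/spec.json change could look like: the two access-key fields stay in `properties` but are dropped from the `required` list. This is an assumption about the spec layout, not the actual file; the other required keys and the titles/descriptions shown here are placeholders, and only `s3_access_key_id` and `s3_secret_access_key` come from the issue itself.

```json
{
  "connectionSpecification": {
    "type": "object",
    "required": ["s3_bucket_name", "s3_bucket_region"],
    "properties": {
      "s3_access_key_id": {
        "type": "string",
        "title": "S3 Access Key ID",
        "description": "Optional when role-based auth (e.g. IRSA) grants S3 access.",
        "airbyte_secret": true
      },
      "s3_secret_access_key": {
        "type": "string",
        "title": "S3 Secret Access Key",
        "description": "Optional when role-based auth (e.g. IRSA) grants S3 access.",
        "airbyte_secret": true
      }
    }
  }
}
```

With the keys removed from `required`, the connector-side code would then need to fall back to the default AWS credentials chain (instance profile / IRSA) when no key is supplied, as the S3 destination already does.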

Relevant log output

No response

Contribute

evantahler commented 1 month ago

Closing: the destination has moved to Unity Catalog.