airbytehq / airbyte

The leading data integration platform for ETL / ELT data pipelines from APIs, databases & files to data warehouses, data lakes & data lakehouses. Both self-hosted and Cloud-hosted.
https://airbyte.com

Unable to authenticate Azure blob with Databricks Lakehouse connector #19011

Closed. anshul-cached closed this issue 2 weeks ago.

anshul-cached commented 1 year ago
## Environment

- **Airbyte version**: 0.40.18
- **OS Version / Instance**: Windows 7/10
- **Deployment**: Docker
- **Destination Connector and version**: Databricks Lakehouse

Current Behavior

The destination's connection check fails to authenticate against the Azure Blob storage container with the provided SAS token (see the log screenshot below).

Expected Behavior

The connector should be able to connect to the service, as both the Databricks details and the Azure Blob SAS token are correct and work fine outside of the Airbyte environment.
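
As a quick sanity check outside Airbyte, listing the container with the SAS token via the azure-storage-blob Python SDK confirms the token itself can authenticate; this is a minimal sketch, and the account URL, container name, and token below are placeholders rather than values from this issue:

```python
from azure.storage.blob import ContainerClient

# Placeholders: substitute your own storage account, container, and SAS token.
ACCOUNT_URL = "https://<storage-account>.blob.core.windows.net"
CONTAINER = "<container-name>"
SAS_TOKEN = "<sas-token>"  # the query-string style token, e.g. "sv=2021-08-06&ss=..."

# If this call succeeds, the token can authenticate and list the container,
# so a failure inside Airbyte points at how the token is scoped or configured there.
container = ContainerClient(
    account_url=ACCOUNT_URL,
    container_name=CONTAINER,
    credential=SAS_TOKEN,
)
for blob in container.list_blobs():
    print(blob.name)
```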

Logs

[screenshot of the connection-check failure logs attached]
jtv8 commented 1 year ago

Experiencing the same issue with the Databricks Lakehouse connector 0.3.1. The SAS token is valid and has every possible permission on the container (Read, Add, Create, Write, Delete, List, Immutable Storage). I'm at a loss here; it's unclear what else I could try.

jtv8 commented 1 year ago

OK, I got this to work. This is a documentation issue: the token needs to be an account-level SAS rather than a container-level SAS. That's misleading, given that the setup form asks for the container name right before the SAS field!

It would also be helpful to have a list of the specific permissions the SAS should have, so we don't have to guess (or violate the least-privilege principle by assigning it everything!).
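
For anyone else hitting this, below is a minimal sketch of generating an account-level SAS with the azure-storage-blob Python SDK. The account name and key are placeholders, and the permission set is a working guess rather than an officially documented minimum:

```python
from datetime import datetime, timedelta, timezone

from azure.storage.blob import (
    AccountSasPermissions,
    ResourceTypes,
    generate_account_sas,
)

# Placeholders: substitute your own storage account name and key.
ACCOUNT_NAME = "<storage-account>"
ACCOUNT_KEY = "<storage-account-key>"

# Account-level SAS (not container-level), which is what finally worked here.
# The permissions mirror what was granted above; the true minimum may be smaller.
sas_token = generate_account_sas(
    account_name=ACCOUNT_NAME,
    account_key=ACCOUNT_KEY,
    resource_types=ResourceTypes(service=True, container=True, object=True),
    permission=AccountSasPermissions(
        read=True, write=True, delete=True, list=True, add=True, create=True
    ),
    expiry=datetime.now(timezone.utc) + timedelta(days=7),
)
print(sas_token)  # paste this value into the connector's SAS token field
```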

evantahler commented 2 weeks ago

Closing - this destination has been updated significantly since this issue was posted.