Closed: anshul-cached closed this issue 2 weeks ago
Experiencing the same issue - Databricks Lakehouse connector 0.3.1. The SAS is valid and has every possible permission on the container (Read, Add, Create, Write, Delete, List, Immutable Storage). I'm at a loss here - it's unclear what else I could try.
OK, I got this to work. This is a documentation issue: it needs to be an account-level SAS rather than a container-level SAS. That's misleading, given that the form asks you for the container name right before!
It would also be helpful to have a list of the specific permissions the SAS should have so we don't have to guess (or violate the least privilege principle by assigning it everything!)
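For anyone hitting this, one quick way to check which kind of SAS you were handed: an account-level SAS carries the `ss` (signed services) and `srt` (signed resource types) query parameters, while a container-level (service) SAS carries `sr` (signed resource) instead. A minimal stdlib-only sketch (the helper name and example tokens are illustrative, and the signatures are placeholders, not real secrets):

```python
from urllib.parse import parse_qs

def sas_scope(sas_token: str) -> str:
    """Classify an Azure Storage SAS token by its query parameters.

    Account-level SAS tokens include `ss` (signed services) and
    `srt` (signed resource types); service-level tokens, including
    container SAS, use `sr` (signed resource) instead.
    """
    params = parse_qs(sas_token.lstrip("?"))
    if "ss" in params and "srt" in params:
        return "account"
    if "sr" in params:
        return "service"
    return "unknown"

# Example tokens for illustration only (sig values are fake).
account_sas = "?sv=2022-11-02&ss=b&srt=sco&sp=rwdlac&se=2024-01-01&sig=FAKE"
container_sas = "?sv=2022-11-02&sr=c&sp=rwdl&se=2024-01-01&sig=FAKE"

print(sas_scope(account_sas))    # account
print(sas_scope(container_sas))  # service
```

If the token you pasted into the connector config classifies as "service", that would explain the connection failure described above.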
Closing - this destination has been updated significantly since this issue was posted.
Current Behavior
Expected Behavior
The connector should be able to connect to the service, as both the Databricks details and the Azure Blob SAS token are correct and work fine outside the Airbyte environment.
Logs