Closed: grishick closed this issue 1 year ago.
timeboxing to 13 points
This is a documentation issue: it needs to be an account-level SAS rather than a container-level SAS, so most likely we need to check and update the documentation that describes the permissions setup.
Destination Databricks: handle optional and object types in input schema (azure only): https://github.com/airbytehq/airbyte/pull/21238.
Still TBD: manually test the connector
- The Databricks connector uses an in-memory buffer. Should we migrate to a file-based buffer?
Yes
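A minimal sketch of what a file-based buffer means here, using only the Python standard library: records are appended to a temp file on disk instead of accumulated in memory, so memory use stays flat no matter how large a batch grows. The `FileBuffer` class and its methods are hypothetical names for illustration, not the connector's actual API.

```python
import json
import os
import tempfile

class FileBuffer:
    """Sketch of a file-backed record buffer: each record is serialized
    to a JSON-lines temp file as it arrives, rather than held in RAM."""

    def __init__(self):
        self._file = tempfile.NamedTemporaryFile(
            mode="w+", suffix=".jsonl", delete=False
        )
        self.count = 0

    def add(self, record: dict) -> None:
        # Append one record; memory cost is constant per call.
        self._file.write(json.dumps(record) + "\n")
        self.count += 1

    def flush(self):
        # Rewind and yield everything buffered so far, e.g. to stage
        # the file as a single upload to external storage.
        self._file.flush()
        self._file.seek(0)
        for line in self._file:
            yield json.loads(line)

    def close(self) -> None:
        # Remove the backing temp file once the batch is staged.
        self._file.close()
        os.unlink(self._file.name)

buf = FileBuffer()
for i in range(3):
    buf.add({"id": i})
records = list(buf.flush())
buf.close()
```

The trade-off is one disk write per record in exchange for bounded memory, which is the usual motivation for moving a destination connector off an in-memory buffer.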
Currently we support external storage only; should we implement managed storage as part of moving to Beta?
Yes
The current implementation supports S3 and Azure external storage. Should we extend it before moving to Beta? The full list of supported sources can be found here
No. S3 and Azure are OK for Beta.
We need to answer the following questions about our Databricks destination connector:
When filing issues that come out of this research, please link them to this epic.