Open findepi opened 2 years ago
cc @jirassimok @alexjo2144
Based on TODOs in the code, I created the following issues related to the Delta Lake connector:
@homar thanks! i moved the above list into issue description. Feel free to remove checkboxes from your comment (or the list)
Do I understand correctly that vanilla Databricks is not yet supported? This connector requires a Thrift scheme in the Hive metastore connection string (`IllegalArgumentException: metastoreUri scheme must be thrift`), and AFAIK Databricks only exposes a JDBC connection string (e.g. `jdbc:spark://adb-123456789.5.azuredatabricks.net:443/default;transportMode=http;ssl=1;httpPath=sql/protocolv1/o/123456789/0427-122644-45iadnd;AuthMech=3;UID=token;PWD=<personal-access-token>`). You can only use Thrift if you set up a custom metastore for Databricks.
> You can only use Thrift if you set up a custom metastore for Databricks.
Yes. Or, use Glue.
> AFAIK Databricks only exposes the JDBC connection string (e.g. `jdbc:spark://adb-123456789.5.azuredatabricks.net:443/default;transportMode=http;ssl=1;httpPath=sql/protocolv1/o/123456789/0427-122644-45iadnd;AuthMech=3;UID=token;PWD=<personal-access-token>`).
We have no plans to connect to Databricks runtime using Databricks JDBC. This would kill most benefits of this connector.
Here are the Databricks docs for setting up an external HMS or Glue: https://docs.databricks.com/data/metastores/index.html. Both of those options are supported.
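For reference, the Trino side of either option is a catalog properties file for the Delta Lake connector. A minimal sketch for an external Thrift HMS (host and port are placeholders):

```properties
connector.name=delta_lake
# External Hive metastore exposed over Thrift; host/port are placeholders
hive.metastore.uri=thrift://example-metastore-host:9083
```

Or, for Glue instead of a Thrift HMS (region is a placeholder):

```properties
connector.name=delta_lake
hive.metastore=glue
hive.metastore.glue.region=us-east-1
```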
@alexjo2144 Thanks. I was researching whether we can use dbt with various sources all through Trino (inspired by this video), and it seems that Databricks is doable as well, although integrating directly through the dbt-databricks plugin is more straightforward. For future generations: using Databricks through the dbt-trino plugin requires setting up and maintaining your own Hive metastore instance and creating a global init script that configures each cluster to use that metastore. Also, DBFS is not supported with this method.
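For anyone going down that path, the gist of the per-cluster configuration is pointing Spark at the external Thrift metastore. A sketch of the Spark config lines such an init script would need to apply (hostname and port are placeholders, and the exact mechanism for applying them in an init script should be taken from the Databricks docs linked above):

```
spark.hadoop.hive.metastore.uris thrift://example-metastore-host:9083
# If the external HMS is reached directly via its JDBC backend instead,
# properties such as spark.hadoop.javax.jdo.option.ConnectionURL and the
# matching driver/user/password settings would be set here as well.
```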
I would think these should change things: https://www.databricks.com/blog/extending-databricks-unity-catalog-open-apache-hive-metastore-api, and Trino versions 440 and above seem to support integrating with the Databricks HMS API: https://trino.io/docs/current/object-storage/metastores.html#thrift-metastore-configuration-properties. Does that look promising?
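If Unity Catalog's HMS-compatible interface works as that blog describes, the Trino side would presumably be configured like any other Thrift metastore. A hypothetical sketch only; the endpoint, port, and authentication details are assumptions I have not verified against a workspace:

```properties
connector.name=delta_lake
# Hypothetical: Unity Catalog's Hive-metastore-compatible endpoint.
# The real host/port and any required auth settings would come from
# the Databricks blog post and the Trino metastore docs linked above.
hive.metastore.uri=thrift://<workspace-host>:<port>
```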