trinodb / trino

Official repository of Trino, the distributed SQL query engine for big data, formerly known as PrestoSQL (https://trino.io)
Apache License 2.0

Delta Lake connector #11296

Open findepi opened 2 years ago

findepi commented 2 years ago

cc @jirassimok @alexjo2144

homar commented 2 years ago

Based on TODOs in the code, I created the following issues related to the Delta Lake connector:

findepi commented 2 years ago

@homar thanks! I moved the above list into the issue description. Feel free to remove the checkboxes from your comment (or the list).

trymzet commented 2 years ago

Am I right that vanilla Databricks is not yet supported? This connector requires a Thrift URI for the Hive metastore connection (IllegalArgumentException: metastoreUri scheme must be thrift), and AFAIK Databricks only exposes a JDBC connection string (eg jdbc:spark://adb-123456789.5.azuredatabricks.net:443/default;transportMode=http;ssl=1;httpPath=sql/protocolv1/o/123456789/0427-122644-45iadnd;AuthMech=3;UID=token;PWD=<personal-access-token>). You can only use Thrift if you set up a custom metastore for Databricks.
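For context, the Thrift requirement shows up in the catalog configuration. A minimal sketch of a Delta Lake catalog file, with a placeholder metastore hostname (not from this thread):

```properties
# etc/catalog/delta.properties -- minimal sketch; hostname is a placeholder
connector.name=delta-lake
# The URI must use the thrift:// scheme; a jdbc:spark:// Databricks connection
# string is rejected ("IllegalArgumentException: metastoreUri scheme must be thrift")
hive.metastore.uri=thrift://metastore.example.com:9083
```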

findepi commented 2 years ago

You can only use Thrift if you set up a custom metastore for Databricks.

Yes. Or, use Glue.
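The Glue alternative swaps the Thrift URI for the Glue metastore implementation. A hedged sketch (region value is a placeholder):

```properties
# etc/catalog/delta.properties -- sketch of the Glue option; region is a placeholder
connector.name=delta-lake
hive.metastore=glue
hive.metastore.glue.region=us-east-1
```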

AFAIK Databricks only exposes the JDBC connection string (eg jdbc:spark://adb-123456789.5.azuredatabricks.net:443/default;transportMode=http;ssl=1;httpPath=sql/protocolv1/o/123456789/0427-122644-45iadnd;AuthMech=3;UID=token;PWD=<personal-access-token>).

We have no plans to connect to Databricks runtime using Databricks JDBC. This would kill most benefits of this connector.

alexjo2144 commented 2 years ago

Here's the Databricks docs for setting up an external HMS or Glue https://docs.databricks.com/data/metastores/index.html. Both of those options are supported.

trymzet commented 2 years ago

@alexjo2144 Thanks. I was researching whether we can use dbt with various sources all through Trino (inspired by this video), and it seems that Databricks is doable as well, although integrating directly through the dbt-databricks plugin is more straightforward. For future generations: using Databricks through the dbt-trino plugin requires setting up and maintaining your own Hive instance and creating a global init script that configures each cluster to use that Hive. Also, DBFS is not supported with this method.
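The global init script approach mentioned above amounts to pinning every cluster's Spark configuration to the external Hive metastore. A sketch of the relevant Spark config keys, per the Databricks external-metastore docs; hostname and version are placeholders to verify against your setup:

```properties
# Cluster Spark config (or set via a global init script) -- placeholder values
spark.hadoop.hive.metastore.uris thrift://metastore.example.com:9083
spark.sql.hive.metastore.version 3.1.0
spark.sql.hive.metastore.jars maven
```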

PragyaJaiswal commented 6 months ago

I would think these should change things: https://www.databricks.com/blog/extending-databricks-unity-catalog-open-apache-hive-metastore-api, and the Trino versions 440 and above seems to have support for integrating with the Databricks HMS API: https://trino.io/docs/current/object-storage/metastores.html#thrift-metastore-configuration-properties. Does that look promising?