relferreira / metabase-sparksql-databricks-driver


Handle connection attempts to unsupported databases #36

Closed: metamben closed this 8 months ago

metamben commented 8 months ago

This PR fixes #23.

As described in the javadoc for java.sql.Driver#connect, when connect is called on a driver, it should return null if it realizes it is the wrong kind of driver to connect to the given URL. With this change, the driver returns nil in that case, and the nil is passed on to the driver manager so that it can try other drivers.
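
For reference, here is the contract from the java.sql.Driver#connect javadoc that the change follows, sketched in plain Java. The class name, URL prefix, and connection helper below are illustrative placeholders, not the driver's actual code:

import java.sql.Connection;
import java.sql.Driver;
import java.sql.DriverPropertyInfo;
import java.sql.SQLException;
import java.sql.SQLFeatureNotSupportedException;
import java.util.Properties;
import java.util.logging.Logger;

// Illustrative sketch of the java.sql.Driver contract; not the actual driver class.
public class ExampleSparkSqlDriver implements Driver {

    // Placeholder subprotocol prefix, for illustration only.
    private static final String URL_PREFIX = "jdbc:sparksql://";

    @Override
    public boolean acceptsURL(String url) {
        return url != null && url.startsWith(URL_PREFIX);
    }

    @Override
    public Connection connect(String url, Properties info) throws SQLException {
        // Per the javadoc: return null for a URL this driver does not recognize,
        // so DriverManager can move on and try the other registered drivers.
        if (!acceptsURL(url)) {
            return null;
        }
        return openConnection(url, info); // real connection logic elided
    }

    private Connection openConnection(String url, Properties info) throws SQLException {
        throw new SQLException("connection logic omitted in this sketch");
    }

    // Remaining Driver methods, stubbed for brevity.
    @Override
    public DriverPropertyInfo[] getPropertyInfo(String url, Properties info) {
        return new DriverPropertyInfo[0];
    }

    @Override
    public int getMajorVersion() { return 1; }

    @Override
    public int getMinorVersion() { return 0; }

    @Override
    public boolean jdbcCompliant() { return false; }

    @Override
    public Logger getParentLogger() throws SQLFeatureNotSupportedException {
        throw new SQLFeatureNotSupportedException();
    }
}

DriverManager tries each registered driver in turn, so a driver that throws (or hands back a bogus connection) for a URL it doesn't own can break the fallback to the driver that should have handled it, which appears to be what #23 ran into.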

slvrtrn commented 8 months ago

@relferreira, can you please release this one ASAP? Currently, #23 is confirmed to affect the Redshift and ClickHouse drivers and potentially other sql_jdbc drivers.

harlemmuniz commented 8 months ago

@slvrtrn confirmed here as well: the error is also happening with SQL Server (sql_jdbc driver) connections.

mshustov commented 7 months ago

@relferreira, is it possible to release a patch version with the fix? As @slvrtrn mentioned, https://github.com/relferreira/metabase-sparksql-databricks-driver/issues/23 is confirmed to affect the Redshift and ClickHouse drivers and potentially other sql_jdbc drivers.

camsaul commented 7 months ago

If anyone wants to try this out, I built the latest version of the driver from the repo today using this script (adapted from the sample Sudoku driver build script):

#! /usr/bin/env bash

set -euxo pipefail

# Adjust these to point at your local Metabase checkout and this driver's checkout.
METABASE_PATH=/home/cam/metabase
DRIVER_PATH=/home/cam/metabase-sparksql-databricks-driver/

cd "$METABASE_PATH"

# Add the driver as a :local/root dep and run Metabase's build-driver entry point;
# the built JAR ends up under "$DRIVER_PATH/target".
clojure \
  -Sdeps "{:aliases {:sparksql-databricks {:extra-deps {com.metabase/sparksql-databricks-driver {:local/root \"$DRIVER_PATH\"}}}}}"  \
  -X:build:sparksql-databricks \
  build-drivers.build-driver/build-driver! \
  "{:driver :sparksql-databricks, :project-dir \"$DRIVER_PATH\", :target-dir \"$DRIVER_PATH/target\"}"

This seemed easier than doing it with the Docker image, but you do need Clojure installed locally and a local checkout of Metabase. (I built this against the release-x.49.x branch of the Metabase repo, since there are some incompatibilities with the current dev 50.x master branch.)

The JAR is in this ZIP file: sparksql-databricks.metabase-driver.zip. Yes, I know we're zipping a zip here, but GH doesn't let you attach .jar files to comments. Probably a good thing that GH discourages JAR uploads, TBH. I encourage people here to build the driver themselves rather than trust some random JAR from the internet.

I'm hoping we get another official release of this driver soon.