apache / arrow-adbc

Database connectivity API standard and libraries for Apache Arrow
https://arrow.apache.org/adbc/
Apache License 2.0

Directly read ADBC to Spark Dataframe #1801

Open HaoXuAI opened 6 months ago

HaoXuAI commented 6 months ago

What feature or improvement would you like to see?

Similar to JDBC, something like:

jdbcDF = spark.read \
    .format("adbc") \
    .option("url", "adbc:postgresql") \
    .option("dbtable", "schema.tablename") \
    .option("user", "username") \
    .option("password", "password") \
    .load()

That would help leverage ADBC in Spark compute environments.

lidavidm commented 6 months ago

I think this should be a Spark feature request?

What I would like to do here is provide a JNI driver that can leverage the better-optimized postgresql/snowflake drivers from Java, though.

HaoXuAI commented 6 months ago

> I think this should be a Spark feature request?
>
> What I would like to do here is provide a JNI driver that can leverage the better-optimized postgresql/snowflake drivers from Java, though.

Right, it should be a Spark feature. I'm posting here to check whether it is a meaningful feature and whether someone from the Arrow team is already working on it. :) What do you mean by a JNI driver?

lidavidm commented 6 months ago

I don't believe anyone is working on this. Best to take it to the Spark community.

The ADBC drivers for Postgres and Snowflake in Java just wrap JDBC, so they don't provide any benefits. If we had JNI bindings to the C++/Go drivers, we might see some performance benefits.

HaoXuAI commented 6 months ago

Makes sense. Let me post it in the Spark repo.

tokoko commented 5 months ago

@HaoXuAI hey, fancy seeing you here 😄 I started this a while ago and then abandoned it (changed jobs and was no longer using Dremio). I can help you bring it back from the dead if you have a use case.

HaoXuAI commented 5 months ago

Hey @tokoko! Great to see you here as well. I don't have a direct use case at work, but I'm thinking about using ADBC in a project to read data on Spark. Do you want to contribute it directly to Spark or keep it as a plugin?

tokoko commented 5 months ago

@HaoXuAI My goal at the time was to get it to mostly working condition as a plugin and then contribute, but we can do it either way.

@lidavidm Even with JNI drivers, the ADBC Java interface itself will still look the same, right? The Spark data source implementation will be independent of how the drivers are implemented.

lidavidm commented 5 months ago

Yes, the idea of JNI would be to implement the same Java-side interface.