FishManHome opened this issue 2 years ago (status: Open)
cc @iodone, can you take a look at this?
Yes, I'm following up on this.
I tried to reproduce the problem with the following example:
With Beeline:
SET kyuubi.operation.language=scala;
spark.sql("add jar file:/Users/work/tmp/jar-test/target/spark_to_cassandra_jars-1.0-SNAPSHOT.jar");
import org.apache.spark.sql.{SaveMode, SparkSession};
import com.datastax.spark.connector.cql.CassandraConnectorConf;
import org.apache.spark.sql.cassandra._;
spark.setCassandraConf(CassandraConnectorConf.KeepAliveMillisParam.option(10000));
val writeDf = spark.read.parquet("/Users/work/tmp/test.parquet");
val x = writeDf.limit(10);
result.set(x);
val cassandraMap = Map("table" -> "test", "keyspace" -> "store");
val df = spark.read.format("org.apache.spark.sql.cassandra").options(cassandraMap).load;
val y = df.limit(10);
result.set(y);
writeDf.write.format("org.apache.spark.sql.cassandra").options(cassandraMap).mode(SaveMode.Append).save()
I tested both Cassandra reads and writes; everything works well.
cc @FishManHome
Code of Conduct
Search before asking
Describe the bug
Based on #2471 and #2475: executing spark.sql("add jar path/xxx") from Spark code throws a ClassNotFoundException with classloader org.apache.spark.util.MutableURLClassLoader.
My Spark code api
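As a rough illustration of the classloader delegation behind this failure mode (a sketch, not Kyuubi's or Spark's actual code): Class.forName resolves a class against one specific loader, so a class that is visible to Spark's MutableURLClassLoader can still trigger ClassNotFoundException if the lookup goes through a different loader that does not delegate to it. The class and object names below are hypothetical.

```scala
// Sketch: the same class name succeeds or fails depending on which
// classloader performs the lookup.
object LoaderDemo {
  // Marker is only visible to the application classloader.
  class Marker

  def main(args: Array[String]): Unit = {
    val appLoader = getClass.getClassLoader
    val name = classOf[Marker].getName

    // Resolving through the loader that actually has the class works.
    println(Class.forName(name, false, appLoader).getSimpleName)

    // The platform (parent) loader cannot see application classes,
    // mirroring the ClassNotFoundException reported in this issue.
    try {
      Class.forName(name, false, ClassLoader.getPlatformClassLoader)
      println("unexpectedly found")
    } catch {
      case _: ClassNotFoundException => println("ClassNotFoundException")
    }
  }
}
```

In the Kyuubi scenario, the jar registered via ADD JAR lands in the session's MutableURLClassLoader, so any code path that resolves classes through another loader (for example, a thread context classloader that was captured earlier) will fail exactly like the second lookup above.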
Affects Version(s)
master
Kyuubi Server Log Output
No response
Kyuubi Engine Log Output
Kyuubi Server Configurations
No response
Kyuubi Engine Configurations
No response
Additional context
No response
Are you willing to submit PR?