Open jiefzz opened 2 years ago
hi @jiefzz, seatunnel doesn't support spark.3.x currently.
Thanks for the reply, @leo65535. We updated our local copy's dependency versions like this:
diff --git a/pom.xml b/pom.xml
index 54e46b48..e3eb341d 100644
--- a/pom.xml
+++ b/pom.xml
@@ -63,12 +63,12 @@
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<java.version>1.8</java.version>
- <scala.version>2.11.8</scala.version>
- <scala.binary.version>2.11</scala.binary.version>
+ <scala.version>2.12.10</scala.version>
+ <scala.binary.version>2.12</scala.binary.version>
<maven.compiler.source>${java.version}</maven.compiler.source>
<maven.compiler.target>${java.version}</maven.compiler.target>
- <spark.version>2.4.0</spark.version>
- <spark.binary.version>2.4</spark.binary.version>
+ <spark.version>3.0.1</spark.version>
+ <spark.binary.version>3.0</spark.binary.version>
<neo4j.connector.spark.version>4.1.0</neo4j.connector.spark.version>
<flink.version>1.13.5</flink.version>
<hudi.version>0.10.0</hudi.version>
@@ -98,11 +98,11 @@
<flink-shaded-hadoop-2.version>2.7.5-7.0</flink-shaded-hadoop-2.version>
<parquet-avro.version>1.10.0</parquet-avro.version>
<transport.version>6.3.1</transport.version>
- <elasticsearch-spark.version>6.8.3</elasticsearch-spark.version>
+<!-- <elasticsearch-spark.version>6.8.3</elasticsearch-spark.version>-->
<clickhouse-jdbc.version>0.2</clickhouse-jdbc.version>
<hbase-spark.version>1.0.0</hbase-spark.version>
<kudu-spark.version>1.7.0</kudu-spark.version>
- <mongo-spark.version>2.2.0</mongo-spark.version>
+ <mongo-spark.version>2.4.1</mongo-spark.version>
<spark-redis.version>2.6.0</spark-redis.version>
<commons-lang3.version>3.4</commons-lang3.version>
<maven-assembly-plugin.version>2.4</maven-assembly-plugin.version>
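After rebuilding with these properties, it is worth confirming that no Scala 2.11 artifacts remain on the runtime classpath, since a single leftover `*_2.11` jar is enough to trigger the error below. A minimal check, assuming the jars end up under a `./seatunnel/lib` directory (the path is hypothetical; adjust it to your actual deployment layout):

```shell
# List any jars still compiled against Scala 2.11 in the deployment directory.
# An empty result means no 2.11-compiled jars remain on the classpath.
find ./seatunnel/lib -name '*_2.11*.jar' 2>/dev/null
```

`mvn dependency:tree` on the rebuilt project gives the same information at build time, per module.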
It actually works on a demo config file like the one pasted above (select from Hive, no transform steps, console sink to print). But I don't know whether this `java.lang.ClassNotFoundException: org.apache.spark.internal.Logging$class` has any side effects or not.
Can you give me some tips?
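For context on that particular error: the Scala 2.11 compiler emitted, for every trait such as `org.apache.spark.internal.Logging`, a companion `Logging$class` holding the trait's method bodies. Scala 2.12 switched to Java 8 default methods and no longer generates `$class` classes, so any jar compiled against Scala 2.11 that ends up on a 2.12 classpath fails at runtime looking up `Logging$class`. The failure mode can be sketched with a plain Java lookup (a minimal illustration, not SeaTunnel code; on a classpath without Spark 2.x the class is absent):

```java
// Demonstrates the failure mode: resolving a class name that is not on
// the classpath throws ClassNotFoundException, exactly what happens when
// 2.11-compiled code looks for the Scala-2.11-only Logging$class.
public class MissingClassDemo {
    public static void main(String[] args) {
        try {
            Class.forName("org.apache.spark.internal.Logging$class");
            System.out.println("found");
        } catch (ClassNotFoundException e) {
            System.out.println("missing: " + e.getMessage());
        }
    }
}
```

So the exception is not cosmetic: it means some dependency on the classpath was still built for Scala 2.11 and will fail whenever that code path is exercised, even if a simple console-sink demo happens to avoid it.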
Search before asking
What happened
When the driver is launched, it shows an exception like the one in the paste area. It seems to be caused by a mismatch between the Spark version in my environment and the one built into the SeaTunnel package.
SeaTunnel Version
v2.0.5 build on branch dev (hotspot jdk1.8.0_202)
SeaTunnel Config
Running Command
Error Exception
Flink or Spark Version
Java or Scala Version
No response
Screenshots
I cannot paste a photo from my company network, sorry.
Are you willing to submit PR?
Code of Conduct