apache / seatunnel

SeaTunnel is a next-generation super high-performance, distributed, massive data integration tool.
https://seatunnel.apache.org/
Apache License 2.0

[Bug] [seatunnel-core-spark] ClassNotFoundException throw while run with spark3.0.3 #1284

Open jiefzz opened 2 years ago

jiefzz commented 2 years ago

Search before asking

What happened

When the driver is launched, it throws an exception, as shown in the log pasted below. It seems to be caused by a mismatch between the Spark version in my environment and the Spark version the SeaTunnel package was built with (a short note on my reading of the error follows the log).

2022-02-17 17:38:50,472 INFO cluster.YarnClientSchedulerBackend: Application application_1644485376937_0125 has started running.
2022-02-17 17:38:50,481 INFO util.Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 12710.
2022-02-17 17:38:50,481 INFO netty.NettyBlockTransferService: Server created on 10.199.142.10:12710
2022-02-17 17:38:50,483 INFO storage.BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
2022-02-17 17:38:50,492 INFO storage.BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 10.199.142.10, 12710, None)
2022-02-17 17:38:50,496 INFO storage.BlockManagerMasterEndpoint: Registering block manager 10.199.142.10:12710 with 366.3 MiB RAM, BlockManagerId(driver, 10.199.142.10, 12710, None)
2022-02-17 17:38:50,499 INFO storage.BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 10.199.142.10, 12710, None)
2022-02-17 17:38:50,500 INFO storage.BlockManager: Initialized BlockManager: BlockManagerId(driver, 10.199.142.10, 12710, None)
2022-02-17 17:38:50,814 INFO ui.ServerInfo: Adding filter to /metrics/json: org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter
2022-02-17 17:38:50,817 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@68b366e2{/metrics/json,null,AVAILABLE,@Spark}
2022-02-17 17:38:50,856 INFO history.SingleEventLogFileWriter: Logging events to hdfs:/user/spark303/applicationHistory/application_1644485376937_0125.lz4.inprogress
2022-02-17 17:38:51,006 INFO cluster.YarnSchedulerBackend$YarnSchedulerEndpoint: ApplicationMaster registered as NettyRpcEndpointRef(spark-client://YarnAM)
2022-02-17 17:38:53,519 INFO resource.ResourceProfile: Default ResourceProfile created, executor resources: Map(cores -> name: cores, amount: 1, script: , vendor: , memory -> name: memory, amount: 512, script: , vendor: ), task resources: Map(cpus -> name: cpus, amount: 1.0)
2022-02-17 17:38:54,296 INFO cluster.YarnSchedulerBackend$YarnDriverEndpoint: Registered executor NettyRpcEndpointRef(spark-client://Executor) (10.199.198.44:50562) with ID 2
2022-02-17 17:38:54,415 INFO storage.BlockManagerMasterEndpoint: Registering block manager j-w1:35491 with 93.3 MiB RAM, BlockManagerId(2, j-w1, 35491, None)
2022-02-17 17:38:56,237 INFO cluster.YarnSchedulerBackend$YarnDriverEndpoint: Registered executor NettyRpcEndpointRef(spark-client://Executor) (10.199.199.229:53648) with ID 1
2022-02-17 17:38:56,321 INFO cluster.YarnClientSchedulerBackend: SchedulerBackend is ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.8
2022-02-17 17:38:56,363 INFO storage.BlockManagerMasterEndpoint: Registering block manager j-w2:50684 with 93.3 MiB RAM, BlockManagerId(1, j-w2, 50684, None)
2022-02-17 17:38:56,457 WARN config.ConfigBuilder: Error when load plugin: [org.apache.seatunnel.spark.sink.Console]
java.util.ServiceConfigurationError: org.apache.seatunnel.spark.BaseSparkSink: Provider org.apache.seatunnel.spark.sink.Kafka could not be instantiated
        at java.util.ServiceLoader.fail(ServiceLoader.java:232)
        at java.util.ServiceLoader.access$100(ServiceLoader.java:185)
        at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:384)
        at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
        at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
        at org.apache.seatunnel.config.ConfigBuilder.createPluginInstanceIgnoreCase(ConfigBuilder.java:137)
        at org.apache.seatunnel.config.ConfigBuilder.lambda$createPlugins$0(ConfigBuilder.java:170)
        at java.util.ArrayList.forEach(ArrayList.java:1257)
        at org.apache.seatunnel.config.ConfigBuilder.createPlugins(ConfigBuilder.java:168)
        at org.apache.seatunnel.Seatunnel.entryPoint(Seatunnel.java:97)
        at org.apache.seatunnel.Seatunnel.run(Seatunnel.java:61)
        at org.apache.seatunnel.SeatunnelSpark.main(SeatunnelSpark.java:29)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
        at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:928)
        at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
        at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
        at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
        at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1007)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1016)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.NoClassDefFoundError: org/apache/spark/internal/Logging$class
        at org.apache.seatunnel.spark.sink.Kafka.<init>(Kafka.scala:31)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
        at java.lang.Class.newInstance(Class.java:442)
        at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:380)
        ... 21 more
Caused by: java.lang.ClassNotFoundException: org.apache.spark.internal.Logging$class
        at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
        ... 28 more
2022-02-17 17:38:56,465 WARN config.ConfigBuilder: Error when load plugin: [org.apache.seatunnel.spark.sink.Console]
java.util.ServiceConfigurationError: org.apache.seatunnel.spark.BaseSparkSink: Provider org.apache.seatunnel.spark.sink.Hive could not be instantiated
        at java.util.ServiceLoader.fail(ServiceLoader.java:232)
        at java.util.ServiceLoader.access$100(ServiceLoader.java:185)
        at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:384)
        at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
        at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
        at org.apache.seatunnel.config.ConfigBuilder.createPluginInstanceIgnoreCase(ConfigBuilder.java:137)
        at org.apache.seatunnel.config.ConfigBuilder.lambda$createPlugins$0(ConfigBuilder.java:170)
        at java.util.ArrayList.forEach(ArrayList.java:1257)
        at org.apache.seatunnel.config.ConfigBuilder.createPlugins(ConfigBuilder.java:168)
        at org.apache.seatunnel.Seatunnel.entryPoint(Seatunnel.java:97)
        at org.apache.seatunnel.Seatunnel.run(Seatunnel.java:61)
        at org.apache.seatunnel.SeatunnelSpark.main(SeatunnelSpark.java:29)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
        at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:928)
        at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
        at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
        at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
        at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1007)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1016)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.NoClassDefFoundError: org/apache/spark/internal/Logging$class
        at org.apache.seatunnel.spark.sink.Hive.<init>(Hive.scala:29)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
        at java.lang.Class.newInstance(Class.java:442)
        at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:380)
        ... 21 more
Caused by: java.lang.ClassNotFoundException: org.apache.spark.internal.Logging$class
        at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
        ... 28 more
2022-02-17 17:38:56,467 WARN config.ConfigBuilder: Error when load plugin: [org.apache.seatunnel.spark.sink.Console]
java.util.ServiceConfigurationError: org.apache.seatunnel.spark.BaseSparkSink: Provider org.apache.seatunnel.spark.sink.Phoenix could not be instantiated
        at java.util.ServiceLoader.fail(ServiceLoader.java:232)
        at java.util.ServiceLoader.access$100(ServiceLoader.java:185)
        at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:384)
        at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
        at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
        at org.apache.seatunnel.config.ConfigBuilder.createPluginInstanceIgnoreCase(ConfigBuilder.java:137)
        at org.apache.seatunnel.config.ConfigBuilder.lambda$createPlugins$0(ConfigBuilder.java:170)
        at java.util.ArrayList.forEach(ArrayList.java:1257)
        at org.apache.seatunnel.config.ConfigBuilder.createPlugins(ConfigBuilder.java:168)
        at org.apache.seatunnel.Seatunnel.entryPoint(Seatunnel.java:97)
        at org.apache.seatunnel.Seatunnel.run(Seatunnel.java:61)
        at org.apache.seatunnel.SeatunnelSpark.main(SeatunnelSpark.java:29)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
        at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:928)
        at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
        at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
        at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
        at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1007)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1016)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.NoClassDefFoundError: org/apache/spark/internal/Logging$class
        at org.apache.seatunnel.spark.sink.Phoenix.<init>(Phoenix.scala:29)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
        at java.lang.Class.newInstance(Class.java:442)
        at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:380)
        ... 21 more
Caused by: java.lang.ClassNotFoundException: org.apache.spark.internal.Logging$class
        at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
        ... 28 more
2022-02-17 17:38:56,469 WARN config.ConfigBuilder: Error when load plugin: [org.apache.seatunnel.spark.sink.Console]
java.util.ServiceConfigurationError: org.apache.seatunnel.spark.BaseSparkSink: Provider org.apache.seatunnel.spark.sink.Redis could not be instantiated
        at java.util.ServiceLoader.fail(ServiceLoader.java:232)
        at java.util.ServiceLoader.access$100(ServiceLoader.java:185)
        at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:384)
        at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
        at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
        at org.apache.seatunnel.config.ConfigBuilder.createPluginInstanceIgnoreCase(ConfigBuilder.java:137)
        at org.apache.seatunnel.config.ConfigBuilder.lambda$createPlugins$0(ConfigBuilder.java:170)
        at java.util.ArrayList.forEach(ArrayList.java:1257)
        at org.apache.seatunnel.config.ConfigBuilder.createPlugins(ConfigBuilder.java:168)
        at org.apache.seatunnel.Seatunnel.entryPoint(Seatunnel.java:97)
        at org.apache.seatunnel.Seatunnel.run(Seatunnel.java:61)
        at org.apache.seatunnel.SeatunnelSpark.main(SeatunnelSpark.java:29)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
        at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:928)
        at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
        at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
        at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
        at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1007)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1016)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.NoClassDefFoundError: org/apache/spark/internal/Logging$class
        at org.apache.seatunnel.spark.sink.Redis.<init>(Redis.scala:31)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
        at java.lang.Class.newInstance(Class.java:442)
        at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:380)
        ... 21 more
Caused by: java.lang.ClassNotFoundException: org.apache.spark.internal.Logging$class
        at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
        ... 28 more
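
For context, one way to read this error, stated as my assumption rather than a confirmed diagnosis: Spark 3.0.3 is built with Scala 2.12, while the SeaTunnel 2.0.5 Spark modules are built against Spark 2.4 / Scala 2.11 by default. Scala 2.11 compiles a trait with concrete members into an interface plus a synthetic TraitName$class holder of the method bodies, and classes that mix the trait in call into that holder at runtime; Scala 2.12 drops the holder and uses Java 8 default methods instead. So a sink compiled against the Scala 2.11 build of org.apache.spark.internal.Logging looks up Logging$class, which does not exist in a Scala 2.12 Spark 3.x runtime. A minimal Scala sketch of that encoding difference (the trait and classes here are illustrative stand-ins, not SeaTunnel or Spark code):

// Stand-in for org.apache.spark.internal.Logging. Compiled with Scala 2.11 this
// produces Logging.class plus a synthetic Logging$class.class holding the method
// body; compiled with Scala 2.12 it produces only Logging.class with a default method.
trait Logging {
  def logInfo(msg: String): Unit = println(s"INFO $msg")
}

// Stand-in for a SeaTunnel Scala sink such as org.apache.seatunnel.spark.sink.Kafka.
// Bytecode produced by Scala 2.11 forwards logInfo to Logging$class.logInfo(this, msg),
// so running that jar on a Scala 2.12 build of Spark fails with
// java.lang.NoClassDefFoundError: org/apache/spark/internal/Logging$class.
class KafkaSink extends Logging {
  def output(): Unit = logInfo("writing batch")
}

object TraitEncodingDemo extends App {
  new KafkaSink().output()
}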

SeaTunnel Version

v2.0.5, built from the dev branch (HotSpot JDK 1.8.0_202)

SeaTunnel Config

env {
  spark.driver.host = "10.199.142.10"
  spark.app.name = "SeaTunnel_test_2.0.5_src_n3__testhost"
  spark.executor.instances = 2
  spark.executor.cores = 1
  spark.executor.memory = "512m"
  spark.sql.catalogImplementation = "hive"
}
source {
  hive {
    pre_sql = "select * from t_cicd.import_first__gw_activitylog where dt='2022-02-10'"
    result_table_name = "my_dataset"
  }
}
transform {
  # do nothing
}
sink {
  Console {}
}

Running Command

seatunnel-dist-2.0.5-SNAPSHOT-2.12.10]$ ./bin/start-seatunnel-spark.sh --master yarn --deploy-mode client --config config/test.sql1.conf

Error Exception

2022-02-17 17:38:56,469 WARN config.ConfigBuilder: Error when load plugin: [org.apache.seatunnel.spark.sink.Console]
java.util.ServiceConfigurationError: org.apache.seatunnel.spark.BaseSparkSink: Provider org.apache.seatunnel.spark.sink.Redis could not be instantiated
        at java.util.ServiceLoader.fail(ServiceLoader.java:232)
        at java.util.ServiceLoader.access$100(ServiceLoader.java:185)
        at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:384)
        at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
        at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
        at org.apache.seatunnel.config.ConfigBuilder.createPluginInstanceIgnoreCase(ConfigBuilder.java:137)
        at org.apache.seatunnel.config.ConfigBuilder.lambda$createPlugins$0(ConfigBuilder.java:170)
        at java.util.ArrayList.forEach(ArrayList.java:1257)
        at org.apache.seatunnel.config.ConfigBuilder.createPlugins(ConfigBuilder.java:168)
        at org.apache.seatunnel.Seatunnel.entryPoint(Seatunnel.java:97)
        at org.apache.seatunnel.Seatunnel.run(Seatunnel.java:61)
        at org.apache.seatunnel.SeatunnelSpark.main(SeatunnelSpark.java:29)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
        at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:928)
        at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
        at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
        at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
        at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1007)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1016)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.NoClassDefFoundError: org/apache/spark/internal/Logging$class
        at org.apache.seatunnel.spark.sink.Redis.<init>(Redis.scala:31)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
        at java.lang.Class.newInstance(Class.java:442)
        at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:380)
        ... 21 more
Caused by: java.lang.ClassNotFoundException: org.apache.spark.internal.Logging$class
        at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
        ... 28 more

Flink or Spark Version

Java or Scala Version

No response

Screenshots

I cannot paste a screenshot from inside my company, sorry.

Are you willing to submit PR?

Code of Conduct

xleoken commented 2 years ago

Hi @jiefzz, SeaTunnel doesn't support Spark 3.x currently.

jiefzz commented 2 years ago

Thanks for the reply, @leo65535.

We updated the dependency versions in our local copy like this:

diff --git a/pom.xml b/pom.xml
index 54e46b48..e3eb341d 100644
--- a/pom.xml
+++ b/pom.xml
@@ -63,12 +63,12 @@
     <properties>
         <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
         <java.version>1.8</java.version>
-        <scala.version>2.11.8</scala.version>
-        <scala.binary.version>2.11</scala.binary.version>
+        <scala.version>2.12.10</scala.version>
+        <scala.binary.version>2.12</scala.binary.version>
         <maven.compiler.source>${java.version}</maven.compiler.source>
         <maven.compiler.target>${java.version}</maven.compiler.target>
-        <spark.version>2.4.0</spark.version>
-        <spark.binary.version>2.4</spark.binary.version>
+        <spark.version>3.0.1</spark.version>
+        <spark.binary.version>3.0</spark.binary.version>
         <neo4j.connector.spark.version>4.1.0</neo4j.connector.spark.version>
         <flink.version>1.13.5</flink.version>
         <hudi.version>0.10.0</hudi.version>
@@ -98,11 +98,11 @@
         <flink-shaded-hadoop-2.version>2.7.5-7.0</flink-shaded-hadoop-2.version>
         <parquet-avro.version>1.10.0</parquet-avro.version>
         <transport.version>6.3.1</transport.version>
-        <elasticsearch-spark.version>6.8.3</elasticsearch-spark.version>
+<!--        <elasticsearch-spark.version>6.8.3</elasticsearch-spark.version>-->
         <clickhouse-jdbc.version>0.2</clickhouse-jdbc.version>
         <hbase-spark.version>1.0.0</hbase-spark.version>
         <kudu-spark.version>1.7.0</kudu-spark.version>
-        <mongo-spark.version>2.2.0</mongo-spark.version>
+        <mongo-spark.version>2.4.1</mongo-spark.version>
         <spark-redis.version>2.6.0</spark-redis.version>
         <commons-lang3.version>3.4</commons-lang3.version>
         <maven-assembly-plugin.version>2.4</maven-assembly-plugin.version>

It actually works with the demo config file pasted above (select from Hive, no transform actions, Console sink to print). But I don't know whether this java.lang.ClassNotFoundException: org.apache.spark.internal.Logging$class has any side effects or not; my rough reading of where those warnings come from is sketched below.
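
For reference, and stated as an assumption rather than a reading of the actual SeaTunnel source: the stack traces show plugin lookup going through java.util.ServiceLoader, which instantiates every provider registered under META-INF/services while searching for the sink named in the config. Providers whose classes cannot load on this Spark/Scala combination (Kafka, Hive, Phoenix, Redis) fail with ServiceConfigurationError, get logged as warnings, and are skipped, while Console itself still loads. A minimal Scala sketch of that behaviour, with illustrative names only, not SeaTunnel's actual ConfigBuilder code:

import java.util.{ServiceConfigurationError, ServiceLoader}
import scala.collection.mutable.ListBuffer

// Hypothetical stand-in for org.apache.seatunnel.spark.BaseSparkSink.
trait BaseSparkSink

object PluginLookupSketch {
  // ServiceLoader's lazy iterator instantiates providers one by one; a provider whose
  // constructor throws (for example the NoClassDefFoundError above) surfaces as a
  // ServiceConfigurationError from next(). Catching it and continuing means each broken
  // sink only costs a warning, and the requested plugin is still found if its own class loads.
  def findSink(name: String): Option[BaseSparkSink] = {
    val usable = ListBuffer.empty[BaseSparkSink]
    val it = ServiceLoader.load(classOf[BaseSparkSink]).iterator()
    while (it.hasNext) {
      try {
        usable += it.next()
      } catch {
        case e: ServiceConfigurationError =>
          println(s"WARN Error when load plugin: $e") // mirrors the warnings in the log above
      }
    }
    usable.find(_.getClass.getSimpleName.equalsIgnoreCase(name))
  }
}

If that reading is right, the warnings for sinks this job never uses should be harmless noise here, but I have not verified that against the SeaTunnel code, so please correct me.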

Can you give me some tips?