apache / linkis

Apache Linkis builds a computation middleware layer to facilitate connection, governance and orchestration between the upper applications and the underlying data engines.
https://linkis.apache.org/

[Question] Failed to submit sparksql task #3225

Closed: kl243818244 closed this issue 1 year ago

kl243818244 commented 2 years ago

Before asking

Your environment

Describe your questions

I installed the Linkis platform and the management console standalone. After installation, while verifying basic functionality, the sh bin/linkis-cli -submitUser ... spark command failed; the same error is also thrown when submitting via Hive.
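For reference, the full shape of that verification command from the Linkis quick-start guide looks like the following; the submit user and engine version are illustrative and should be matched to your deployment (here likely a Spark 2.3.2 engine):

    sh bin/linkis-cli -submitUser hadoop -engineType spark-2.4.3 -codeType sql -code "show databases;"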

Eureka service list

[screenshot omitted]
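The same registry information can also be checked without the UI by querying Eureka's REST API; the host below is a placeholder, and 20303 is the Linkis default Eureka port (adjust if customized):

    curl -s -H 'Accept: application/json' http://<eureka-host>:20303/eureka/apps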

Some logs info or attached files

stdout.log:


1b3/tmp/blockmgr-1098b279-2df6-4fc0-8c3a-5ecb57b2f5ab
2022-09-03 11:57:10.746 [INFO ] [main                                    ] o.a.s.s.m.MemoryStore (54) [logInfo] - MemoryStore started with capacity 434.4 MB
2022-09-03 11:57:10.813 [INFO ] [main                                    ] o.a.s.SparkEnv (54) [logInfo] - Registering OutputCommitCoordinator
2022-09-03 11:57:11.142 [INFO ] [main                                    ] o.a.s.u.Utils (54) [logInfo] - Successfully started service 'SparkUI' on port 4040.
2022-09-03 11:57:11.333 [INFO ] [main                                    ] o.a.s.u.SparkUI (54) [logInfo] - Bound SparkUI to 0.0.0.0, and started at http://******:4040
2022-09-03 11:57:11.349 [INFO ] [main                                    ] o.a.s.SparkContext (54) [logInfo] - Added JAR file:/appcom/tmp/hadoop/20220903/spark/ff2cebf5-72d7-4ef2-9254-d43ca498c1b3/lib/linkis-engineplugin-spark-1.1.3.jar at spark://******:6476/jars/linkis-engineplugin-spark-1.1.3.jar with timestamp 1662177431347
2022-09-03 11:57:11.433 [WARN ] [main                                    ] o.a.s.s.FairSchedulableBuilder (66) [logWarning] - Fair Scheduler configuration file not found so jobs will be scheduled in FIFO order. To use fair scheduling, configure pools in fairscheduler.xml or set spark.scheduler.allocation.file to a file that contains the configuration.
2022-09-03 11:57:11.439 [INFO ] [main                                    ] o.a.s.s.FairSchedulableBuilder (54) [logInfo] - Created default pool: default, schedulingMode: FIFO, minShare: 0, weight: 1
2022-09-03 11:57:12.634 [INFO ] [main                                    ] o.a.h.y.c.ConfiguredRMFailoverProxyProvider (100) [performFailover] - Failing over to rm2
2022-09-03 11:57:12.673 [INFO ] [main                                    ] o.a.s.d.y.Client (54) [logInfo] - Requesting a new application from cluster with 6 NodeManagers
2022-09-03 11:57:12.755 [INFO ] [main                                    ] o.a.h.c.Configuration (2757) [getConfResourceAsInputStream] - found resource resource-types.xml at file:/etc/hadoop/3.1.4.0-315/0/resource-types.xml
2022-09-03 11:57:12.784 [INFO ] [main                                    ] o.a.s.d.y.Client (54) [logInfo] - Verifying our application has not requested more than the maximum memory capability of the cluster (73728 MB per container)
2022-09-03 11:57:12.786 [INFO ] [main                                    ] o.a.s.d.y.Client (54) [logInfo] - Will allocate AM container, with 896 MB memory including 384 MB overhead
2022-09-03 11:57:12.787 [INFO ] [main                                    ] o.a.s.d.y.Client (54) [logInfo] - Setting up container launch context for our AM
2022-09-03 11:57:12.795 [INFO ] [main                                    ] o.a.s.d.y.Client (54) [logInfo] - Setting up the launch environment for our AM container
2022-09-03 11:57:12.806 [INFO ] [main                                    ] o.a.s.d.y.Client (54) [logInfo] - Preparing resources for our AM container
2022-09-03 11:57:14.556 [INFO ] [main                                    ] o.a.s.d.y.Client (54) [logInfo] - Use hdfs cache file as spark.yarn.archive for HDP, hdfsCacheFile:hdfs://******/hdp/apps/3.1.4.0-315/spark2/spark2-hdp-yarn-archive.tar.gz
2022-09-03 11:57:14.563 [INFO ] [main                                    ] o.a.s.d.y.Client (54) [logInfo] - Source and destination file systems are the same. Not copying hdfs://******/hdp/apps/3.1.4.0-315/spark2/spark2-hdp-yarn-archive.tar.gz
2022-09-03 11:57:14.672 [INFO ] [main                                    ] o.a.s.d.y.Client (54) [logInfo] - Distribute hdfs cache file as spark.sql.hive.metastore.jars for HDP, hdfsCacheFile:hdfs://******/hdp/apps/3.1.4.0-315/spark2/spark2-hdp-hive-archive.tar.gz
2022-09-03 11:57:14.673 [INFO ] [main                                    ] o.a.s.d.y.Client (54) [logInfo] - Source and destination file systems are the same. Not copying hdfs://******/hdp/apps/3.1.4.0-315/spark2/spark2-hdp-hive-archive.tar.gz
2022-09-03 11:57:14.690 [INFO ] [main                                    ] o.a.s.d.y.Client (54) [logInfo] - Source and destination file systems are the same. Not copying hdfs://******/home/spark_conf/hive-site.xml
2022-09-03 11:57:14.713 [WARN ] [main                                    ] o.a.s.d.y.Client (66) [logWarning] - Same path resource hdfs://******/home/spark_conf/hive-site.xml added multiple times to distributed cache.
2022-09-03 11:57:14.723 [INFO ] [main                                    ] o.a.s.d.y.Client (54) [logInfo] - Deleted staging directory hdfs://******/user/hadoop/.sparkStaging/application_1661773015716_0041
2022-09-03 11:57:14.726 [ERROR] [main                                    ] o.a.s.SparkContext (91) [logError] - Error initializing SparkContext. java.lang.IllegalArgumentException: Attempt to add (hdfs://******/home/spark_conf/hive-site.xml) multiple times to the distributed cache.
        at org.apache.spark.deploy.yarn.Client$$anonfun$prepareLocalResources$17$$anonfun$apply$6.apply(Client.scala:660) ~[spark-yarn_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
        at org.apache.spark.deploy.yarn.Client$$anonfun$prepareLocalResources$17$$anonfun$apply$6.apply(Client.scala:651) ~[spark-yarn_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
        at scala.collection.mutable.ArraySeq.foreach(ArraySeq.scala:74) ~[scala-library-2.11.12.jar:?]
        at org.apache.spark.deploy.yarn.Client$$anonfun$prepareLocalResources$17.apply(Client.scala:651) ~[spark-yarn_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
        at org.apache.spark.deploy.yarn.Client$$anonfun$prepareLocalResources$17.apply(Client.scala:650) ~[spark-yarn_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
        at scala.collection.immutable.List.foreach(List.scala:392) ~[scala-library-2.11.12.jar:?]
        at org.apache.spark.deploy.yarn.Client.prepareLocalResources(Client.scala:650) ~[spark-yarn_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
        at org.apache.spark.deploy.yarn.Client.createContainerLaunchContext(Client.scala:921) ~[spark-yarn_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
        at org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:169) ~[spark-yarn_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
        at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:57) ~[spark-yarn_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
        at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:164) ~[spark-core_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:500) ~[spark-core_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
        at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2498) ~[spark-core_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
        at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:934) ~[spark-sql_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
        at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:925) ~[spark-sql_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
        at scala.Option.getOrElse(Option.scala:121) ~[scala-library-2.11.12.jar:?]
        at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:925) ~[spark-sql_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
        at org.apache.linkis.engineplugin.spark.factory.SparkEngineConnFactory.createSparkSession(SparkEngineConnFactory.scala:117) ~[linkis-engineplugin-spark-1.1.3.jar:1.1.3]
:
2022-09-03 11:57:14.752 [INFO ] [main                                    ] o.a.s.u.SparkUI (54) [logInfo] - Stopped Spark web UI at http://******:4040
2022-09-03 11:57:14.773 [WARN ] [dispatcher-event-loop-9                 ] o.a.s.s.c.YarnSchedulerBackend$YarnSchedulerEndpoint (66) [logWarning] - Attempted to request executors before the AM has registered!
2022-09-03 11:57:14.781 [INFO ] [main                                    ] o.a.s.s.c.YarnClientSchedulerBackend (54) [logInfo] - Stopped
2022-09-03 11:57:14.797 [INFO ] [dispatcher-event-loop-11                ] o.a.s.MapOutputTrackerMasterEndpoint (54) [logInfo] - MapOutputTrackerMasterEndpoint stopped!
2022-09-03 11:57:14.810 [INFO ] [main                                    ] o.a.s.s.m.MemoryStore (54) [logInfo] - MemoryStore cleared
2022-09-03 11:57:14.812 [INFO ] [main                                    ] o.a.s.s.BlockManager (54) [logInfo] - BlockManager stopped
2022-09-03 11:57:14.832 [INFO ] [main                                    ] o.a.s.s.BlockManagerMaster (54) [logInfo] - BlockManagerMaster stopped
2022-09-03 11:57:14.834 [WARN ] [main                                    ] o.a.s.m.MetricsSystem (66) [logWarning] - Stopping a MetricsSystem that is not running
2022-09-03 11:57:14.850 [INFO ] [dispatcher-event-loop-16                ] o.a.s.s.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint (54) [logInfo] - OutputCommitCoordinator stopped!
2022-09-03 11:57:14.876 [INFO ] [main                                    ] o.a.s.SparkContext (54) [logInfo] - Successfully stopped SparkContext
2022-09-03 11:57:14.877 [ERROR] [main                                    ] o.a.l.e.l.EngineConnServer$ (58) [error] - EngineConnServer Start Failed. java.lang.IllegalArgumentException: Attempt to add (hdfs://******/home/spark_conf/hive-site.xml) multiple times to the distributed cache.
        at org.apache.spark.deploy.yarn.Client$$anonfun$prepareLocalResources$17$$anonfun$apply$6.apply(Client.scala:660) ~[spark-yarn_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
        at org.apache.spark.deploy.yarn.Client$$anonfun$prepareLocalResources$17$$anonfun$apply$6.apply(Client.scala:651) ~[spark-yarn_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
        at scala.collection.mutable.ArraySeq.foreach(ArraySeq.scala:74) ~[scala-library-2.11.12.jar:?]
        at org.apache.spark.deploy.yarn.Client$$anonfun$prepareLocalResources$17.apply(Client.scala:651) ~[spark-yarn_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
        at org.apache.spark.deploy.yarn.Client$$anonfun$prepareLocalResources$17.apply(Client.scala:650) ~[spark-yarn_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
        at scala.collection.immutable.List.foreach(List.scala:392) ~[scala-library-2.11.12.jar:?]
        at org.apache.spark.deploy.yarn.Client.prepareLocalResources(Client.scala:650) ~[spark-yarn_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
        at org.apache.spark.deploy.yarn.Client.createContainerLaunchContext(Client.scala:921) ~[spark-yarn_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
        at org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:169) ~[spark-yarn_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
        at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:57) ~[spark-yarn_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
        at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:164) ~[spark-core_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:500) ~[spark-core_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
        at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2498) ~[spark-core_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
        at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:934) ~[spark-sql_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
        at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:925) ~[spark-sql_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
        at scala.Option.getOrElse(Option.scala:121) ~[scala-library-2.11.12.jar:?]
        at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:925) ~[spark-sql_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
        at org.apache.linkis.engineplugin.spark.factory.SparkEngineConnFactory.createSparkSession(SparkEngineConnFactory.scala:117) ~[linkis-engineplugin-spark-1.1.3.jar:1.1.3]
        at org.apache.linkis.engineplugin.spark.factory.SparkEngineConnFactory.createEngineConnSession(SparkEngineConnFactory.scala:74) ~[linkis-engineplugin-spark-1.1.3.jar:1.1.3]
        at org.apache.linkis.manager.engineplugin.common.creation.AbstractEngineConnFactory$class.createEngineConn(EngineConnFactory.scala:48) ~[linkis-engineconn-plugin-core-1.1.3.jar:1.1.3]
        at org.apache.linkis.engineplugin.spark.factory.SparkEngineConnFactory.createEngineConn(SparkEngineConnFactory.scala:42) ~[linkis-engineplugin-spark-1.1.3.jar:1.1.3]
        at org.apache.linkis.engineconn.core.engineconn.DefaultEngineConnManager.createEngineConn(EngineConnManager.scala:45) ~[linkis-engineconn-core-1.1.3.jar:1.1.3]
        at org.apache.linkis.engineconn.launch.EngineConnServer$.main(EngineConnServer.scala:64) ~[linkis-engineconn-core-1.1.3.jar:1.1.3]
        at org.apache.linkis.engineconn.launch.EngineConnServer.main(EngineConnServer.scala) ~[linkis-engineconn-core-1.1.3.jar:1.1.3]
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_202]
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_202]
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_202]
        at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_202]
        at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52) ~[spark-core_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:904) ~[spark-core_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:198) ~[spark-core_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:228) ~[spark-core_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:137) ~[spark-core_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) ~[spark-core_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]

2022-09-03 11:57:14.891 [ERROR] [main                                    ] o.a.l.e.c.s.EngineConnAfterStartCallback (46) [callback] - protocol will send to em: EngineConnStatusCallback(ServiceInstance(linkis-cg-engineconn, ******:8681),ff2cebf5-72d7-4ef2-9254-d43ca498c1b3,Failed,ServiceInstance(linkis-cg-engineconn, ******:8681): log dir: /appcom/tmp/hadoop/20220903/spark/ff2cebf5-72d7-4ef2-9254-d43ca498c1b3/logs,IllegalArgumentException: Attempt to add (hdfs://******/home/spark_conf/hive-site.xml) multiple times to the distributed cache.)
2022-09-03 11:57:14.912 [ERROR] [main                                    ] o.a.l.e.c.e.h.ComputationEngineConnHook (58) [error] - EngineConnSever start failed! now exit. java.lang.IllegalArgumentException: Attempt to add (hdfs://******/home/spark_conf/hive-site.xml) multiple times to the distributed cache.
        at org.apache.spark.deploy.yarn.Client$$anonfun$prepareLocalResources$17$$anonfun$apply$6.apply(Client.scala:660) ~[spark-yarn_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
        at org.apache.spark.deploy.yarn.Client$$anonfun$prepareLocalResources$17$$anonfun$apply$6.apply(Client.scala:651) ~[spark-yarn_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
        at scala.collection.mutable.ArraySeq.foreach(ArraySeq.scala:74) ~[scala-library-2.11.12.jar:?]
        at org.apache.spark.deploy.yarn.Client$$anonfun$prepareLocalResources$17.apply(Client.scala:651) ~[spark-yarn_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
        at org.apache.spark.deploy.yarn.Client$$anonfun$prepareLocalResources$17.apply(Client.scala:650) ~[spark-yarn_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
        at scala.collection.immutable.List.foreach(List.scala:392) ~[scala-library-2.11.12.jar:?]
        at org.apache.spark.deploy.yarn.Client.prepareLocalResources(Client.scala:650) ~[spark-yarn_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
        at org.apache.spark.deploy.yarn.Client.createContainerLaunchContext(Client.scala:921) ~[spark-yarn_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
        at org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:169) ~[spark-yarn_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
        at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:57) ~[spark-yarn_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
        at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:164) ~[spark-core_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:500) ~[spark-core_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
        at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2498) ~[spark-core_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
        at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:934) ~[spark-sql_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
        at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:925) ~[spark-sql_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
        at scala.Option.getOrElse(Option.scala:121) ~[scala-library-2.11.12.jar:?]
        at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:925) ~[spark-sql_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
        at org.apache.linkis.engineplugin.spark.factory.SparkEngineConnFactory.createSparkSession(SparkEngineConnFactory.scala:117) ~[linkis-engineplugin-spark-1.1.3.jar:1.1.3]
        at org.apache.linkis.engineplugin.spark.factory.SparkEngineConnFactory.createEngineConnSession(SparkEngineConnFactory.scala:74) ~[linkis-engineplugin-spark-1.1.3.jar:1.1.3]
        at org.apache.linkis.manager.engineplugin.common.creation.AbstractEngineConnFactory$class.createEngineConn(EngineConnFactory.scala:48) ~[linkis-engineconn-plugin-core-1.1.3.jar:1.1.3]
        at org.apache.linkis.engineplugin.spark.factory.SparkEngineConnFactory.createEngineConn(SparkEngineConnFactory.scala:42) ~[linkis-engineplugin-spark-1.1.3.jar:1.1.3]
        at org.apache.linkis.engineconn.core.engineconn.DefaultEngineConnManager.createEngineConn(EngineConnManager.scala:45) ~[linkis-engineconn-core-1.1.3.jar:1.1.3]
        at org.apache.linkis.engineconn.launch.EngineConnServer$.main(EngineConnServer.scala:64) ~[linkis-engineconn-core-1.1.3.jar:1.1.3]
        at org.apache.linkis.engineconn.launch.EngineConnServer.main(EngineConnServer.scala) ~[linkis-engineconn-core-1.1.3.jar:1.1.3]
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_202]
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_202]
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_202]
        at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_202]
        at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52) ~[spark-core_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:904) ~[spark-core_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:198) ~[spark-core_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:228) ~[spark-core_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:137) ~[spark-core_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) ~[spark-core_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]

2022-09-03 11:57:14.917 [ERROR] [main                                    ] o.a.l.e.c.h.ShutdownHook (58) [error] - process exit reason: java.lang.IllegalArgumentException: Attempt to add (hdfs://******/home/spark_conf/hive-site.xml) multiple times to the distributed cache.
        at org.apache.spark.deploy.yarn.Client$$anonfun$prepareLocalResources$17$$anonfun$apply$6.apply(Client.scala:660) ~[spark-yarn_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
        at org.apache.spark.deploy.yarn.Client$$anonfun$prepareLocalResources$17$$anonfun$apply$6.apply(Client.scala:651) ~[spark-yarn_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
        at scala.collection.mutable.ArraySeq.foreach(ArraySeq.scala:74) ~[scala-library-2.11.12.jar:?]
        at org.apache.spark.deploy.yarn.Client$$anonfun$prepareLocalResources$17.apply(Client.scala:651) ~[spark-yarn_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
        at org.apache.spark.deploy.yarn.Client$$anonfun$prepareLocalResources$17.apply(Client.scala:650) ~[spark-yarn_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
        at scala.collection.immutable.List.foreach(List.scala:392) ~[scala-library-2.11.12.jar:?]
        at org.apache.spark.deploy.yarn.Client.prepareLocalResources(Client.scala:650) ~[spark-yarn_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
        at org.apache.spark.deploy.yarn.Client.createContainerLaunchContext(Client.scala:921) ~[spark-yarn_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
        at org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:169) ~[spark-yarn_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
        at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:57) ~[spark-yarn_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
        at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:164) ~[spark-core_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]

log file:
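The trace points at org.apache.spark.deploy.yarn.Client.prepareLocalResources, which rejects any path that is already present in the YARN distributed cache, and the WARN line just before the failure shows hdfs://******/home/spark_conf/hive-site.xml being added a second time. On HDP, the Spark client's spark-defaults.conf typically already distributes hive-site.xml (e.g. via spark.yarn.dist.files), and the Linkis Spark engine submission adds it again. A minimal diagnostic sketch, assuming the HDP Spark client config is the source of the duplicate (the config path below is a guess; adjust it for your cluster):

    # Check whether the Spark client config already distributes hive-site.xml.
    grep -n 'hive-site.xml' /usr/hdp/current/spark2-client/conf/spark-defaults.conf
    # If spark.yarn.dist.files (or a --files entry) lists
    # hdfs://.../home/spark_conf/hive-site.xml, remove one of the two entries so
    # the file is added to the distributed cache only once.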

github-actions[bot] commented 2 years ago

Welcome to the Apache Linkis (incubating) community!! We are glad that you are contributing by opening this issue.

Please make sure to include all the relevant context. We will be here shortly.

If you are interested in contributing to our website project, please let us know! You can check out our contributing guide: How to Participate in Project Contribution.

WeChat Group: [QR code image omitted]

Mailing Lists:
dev@linkis.apache.org — community activity information (subscribe / unsubscribe / archive)

casionone commented 1 year ago

This question has not been answered for a long time, so I will close it for now; if necessary, you can re-open it.