apache / kyuubi

Apache Kyuubi is a distributed and multi-tenant gateway to provide serverless SQL on data warehouses and lakehouses.
https://kyuubi.apache.org/
Apache License 2.0

[Bug] Kyuubi engine fails to start when loading a custom Listener #2891

Open Narcasserun opened 2 years ago

Narcasserun commented 2 years ago

Describe the bug

Kyuubi fails to start the Spark SQL engine when loading a custom listener (`spark.extraListeners`). In YARN client mode the engine starts normally, but in cluster mode it fails to start.
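For background: in YARN cluster mode the Spark driver runs inside the ApplicationMaster container on the cluster, so any class listed in `spark.extraListeners` must be loadable there, whereas in client mode the driver runs on the Kyuubi server, where a server-local jar path is enough. A hedged sketch of the difference (all paths here are illustrative, not the reporter's actual setup):

```shell
# Illustrative only: custom SparkListener jar in client vs. cluster mode.
# Client mode: the driver runs on the submitting host, so a local jar works.
spark-submit --master yarn --deploy-mode client \
    --conf spark.extraListeners=com.lly.lens.JobListener \
    --jars /local/path/lens.jar ...

# Cluster mode: the driver runs in the YARN AM container. The listener jar
# must be shipped there (e.g. via --jars, which uploads it to the staging
# dir) or be readable from a cluster-visible path such as HDFS. Any config
# that points at a server-local path will not resolve on the AM host.
spark-submit --master yarn --deploy-mode cluster \
    --conf spark.extraListeners=com.lly.lens.JobListener \
    --jars hdfs:///shared/path/lens.jar ...
```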

Affects Version(s)

1.5.1

Kyuubi Server Log Output

09:20:04.816 INFO org.apache.zookeeper.ClientCnxn: Session establishment complete on server HZ3-BD-2204-V2809.lianlianpay-dc.com/10.90.37.115:2181, sessionid = 0x100faf3ff4b0001, negotiated timeout = 60000
09:20:04.817 INFO org.apache.zookeeper.server.ZooKeeperServer: Established session 0x100faf3ff4b0001 with negotiated timeout 60000 for client /10.90.37.115:37874
09:20:04.818 INFO org.apache.curator.framework.state.ConnectionStateManager: State change: CONNECTED
09:20:04.845 INFO org.apache.kyuubi.engine.EngineRef: Launching engine:
/home/dubbo/spark-3.2.1-bin-hadoop2.7/bin/spark-submit \
    --class org.apache.kyuubi.engine.spark.SparkSQLEngine \
    --conf spark.kyuubi.session.engine.idle.timeout=PT30M \
    --conf spark.kyuubi.session.engine.check.interval=PT5M \
    --conf spark.sql.hive.convertMetastoreParquet=false \
    --conf spark.hive.server2.thrift.resultset.default.fetch.size=1000 \
    --conf spark.kyuubi.ha.zookeeper.quorum=HZ3-BD-2204-V280:2181 \
    --conf spark.kyuubi.sparklens.jars.location=/home/dubbo/apache-kyuubi-1.5.1-incubating-bin/external_jars/lens_2.12-1.0.2-SNAPSHOT.jar \
    --conf spark.extraListeners=com.lly.lens.JobListener \
    --conf spark.kyuubi.client.ip=10.90.37.115 \
    --conf spark.yarn.queue=llyt \
    --conf spark.kyuubi.engine.submit.time=1655342404826 \
    --conf spark.app.name=dwd_evt_payout_deposit_di \
    --conf spark.executorEnv.HADOOP_USER_NAME=hdfs \
    --conf spark.driver.memory=4G \
    --conf spark.executor.instances=20 \
    --conf spark.kyuubi.ha.engine.ref.id=964041b0-08c9-4663-9d02-ed5e4001bf54 \
    --conf spark.executorEnv.SPARK_USER=hdfs \
    --conf spark.kyuubi.session.conf.advisor=org.apache.kyuubi.plugin.HSessionConfAdvisor \
    --conf spark.driver.cores=2 \
    --conf spark.kyuubi.ha.zookeeper.auth.type=NONE \
    --conf spark.submit.deployMode=cluster \
    --conf spark.master=yarn \
    --conf spark.yarn.tags=KYUUBI \
    --conf spark.kyuubi.engine.share.level=CONNECTION \
    --conf spark.kyuubi.ha.zookeeper.namespace=/kyuubi_1.5.1-SNAPSHOT_CONNECTION_SPARK_SQL/hdfs/964041b0-08c9-4663-9d02-ed5e4001bf54 \
    --conf spark.executor.memory=15G \
    --conf spark.executor.cores=4 \
    --conf spark.yarn.executor.memoryOverhead=6G \
    --conf spark.memory.fraction=0.8 \
    --conf spark.kyuubi.engine.type=SPARK_SQL \
    --conf spark.executor.memory=12 \
    --conf spark.sql.shuffle.partitions=300 \
    --jars /home/dubbo/apache-kyuubi-1.5.1-incubating-bin/external_jars/lens_2.12-1.0.2-SNAPSHOT.jar \
    --proxy-user hdfs /home/dubbo/apache-kyuubi-1.5.1-incubating-bin/externals/engines/spark/kyuubi-spark-sql-engine_2.12-1.5.1-SNAPSHOT.jar
09:20:04.857 INFO org.apache.kyuubi.engine.ProcBuilder: Logging to /home/dubbo/apache-kyuubi-1.5.1-incubating-bin/work/hdfs/kyuubi-spark-sql-engine.log.2
09:20:24.644 INFO org.apache.curator.framework.imps.CuratorFrameworkImpl: backgroundOperationsLoop exiting
09:20:24.646 INFO org.apache.zookeeper.server.PrepRequestProcessor: Processed session termination for sessionid: 0x100faf3ff4b0001
09:20:24.648 INFO org.apache.zookeeper.ZooKeeper: Session: 0x100faf3ff4b0001 closed
09:20:24.648 INFO org.apache.zookeeper.ClientCnxn: EventThread shut down for session: 0x100faf3ff4b0001
09:20:24.649 INFO org.apache.zookeeper.server.NIOServerCnxn: Closed socket connection for client /10.90.37.115:37874 which had sessionid 0x100faf3ff4b0001
09:20:24.654 INFO org.apache.kyuubi.operation.LaunchEngine: Processing hdfs's query[71bbed2b-6424-4db3-b771-a2eaf8012f8c]: RUNNING_STATE -> ERROR_STATE, statement: LAUNCH_ENGINE, time taken: 19.854 seconds
09:20:24.677 INFO org.apache.kyuubi.server.KyuubiTBinaryFrontendService: Received request of closing SessionHandle [f07458c0-c2f0-4af8-89db-0c6d3e321e64]
09:20:24.679 INFO org.apache.kyuubi.session.KyuubiSessionManager: SessionHandle [f07458c0-c2f0-4af8-89db-0c6d3e321e64] is closed, current opening sessions 0
09:20:24.683 INFO org.apache.kyuubi.operation.LaunchEngine: Processing hdfs's query[71bbed2b-6424-4db3-b771-a2eaf8012f8c]: ERROR_STATE -> CLOSE

Kyuubi Engine Log Output

22/06/16 09:20:15 INFO Client: Application report for application_1654630917149_16762 (state: ACCEPTED)
22/06/16 09:20:16 INFO Client: Application report for application_1654630917149_16762 (state: ACCEPTED)
22/06/16 09:20:17 INFO Client: Application report for application_1654630917149_16762 (state: ACCEPTED)
22/06/16 09:20:18 INFO Client: Application report for application_1654630917149_16762 (state: ACCEPTED)
22/06/16 09:20:19 INFO Client: Application report for application_1654630917149_16762 (state: ACCEPTED)
22/06/16 09:20:20 INFO Client: Application report for application_1654630917149_16762 (state: ACCEPTED)
22/06/16 09:20:21 INFO Client: Application report for application_1654630917149_16762 (state: ACCEPTED)
22/06/16 09:20:22 INFO Client: Application report for application_1654630917149_16762 (state: ACCEPTED)
22/06/16 09:20:23 INFO Client: Application report for application_1654630917149_16762 (state: FAILED)
22/06/16 09:20:23 INFO Client: 
     client token: N/A
     diagnostics: Application application_1654630917149_16762 failed 2 times due to AM Container for appattempt_1654630917149_16762_000002 exited with  exitCode: 13
Failing this attempt.Diagnostics: [2022-06-16 09:20:23.053]Exception from container-launch.
Container id: container_e82_1654630917149_16762_02_000001
Exit code: 13

[2022-06-16 09:20:23.054]Container exited with a non-zero exit code 13. Error file: prelaunch.err.
Last 4096 bytes of prelaunch.err :
Last 4096 bytes of stderr :
e.hadoop.util.ShutdownHookManager.addShutdownHook(ShutdownHookManager.java:152)
    at org.apache.hadoop.tracing.SpanReceiverHost.get(SpanReceiverHost.java:79)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:634)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:619)
    at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:149)
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2669)
    at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:94)
    at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2703)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2685)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:373)
    at org.apache.hadoop.fs.Path.getFileSystem(Path.java:295)
    at org.apache.spark.deploy.yarn.ApplicationMaster.cleanupStagingDir(ApplicationMaster.scala:683)
    at org.apache.spark.deploy.yarn.ApplicationMaster.$anonfun$run$2(ApplicationMaster.scala:267)
    at org.apache.spark.util.SparkShutdownHook.run(ShutdownHookManager.scala:214)
    at org.apache.spark.util.SparkShutdownHookManager.$anonfun$runAll$2(ShutdownHookManager.scala:188)
    at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
    at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:2019)
    at org.apache.spark.util.SparkShutdownHookManager.$anonfun$runAll$1(ShutdownHookManager.scala:188)
    at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
    at scala.util.Try$.apply(Try.scala:213)
    at org.apache.spark.util.SparkShutdownHookManager.runAll(ShutdownHookManager.scala:188)
    at org.apache.spark.util.SparkShutdownHookManager$$anon$2.run(ShutdownHookManager.scala:178)
    at org.apache.hadoop.util.ShutdownHookManager$1.run(ShutdownHookManager.java:54)
22/06/16 09:20:22 INFO ShutdownHookManager: Shutdown hook called
22/06/16 09:20:22 INFO ShutdownHookManager: Deleting directory /data8/hadoop/yarn/local/usercache/hdfs/appcache/application_1654630917149_16762/spark-310f70f7-060b-4c37-9b40-29c68f8737aa
22/06/16 09:20:22 INFO ShutdownHookManager: Deleting directory /data2/hadoop/yarn/local/usercache/hdfs/appcache/application_1654630917149_16762/spark-a2725161-0f29-470d-95d4-ad7762057e4b
22/06/16 09:20:22 INFO ShutdownHookManager: Deleting directory /data3/hadoop/yarn/local/usercache/hdfs/appcache/application_1654630917149_16762/spark-49c7dee2-2a77-49d0-b307-73a72cf4e5dc
22/06/16 09:20:22 INFO ShutdownHookManager: Deleting directory /data1/hadoop/yarn/local/usercache/hdfs/appcache/application_1654630917149_16762/spark-0dd13a24-0c92-4ec6-a5da-e21de71588e3
22/06/16 09:20:22 INFO ShutdownHookManager: Deleting directory /data5/hadoop/yarn/local/usercache/hdfs/appcache/application_1654630917149_16762/spark-51ee1773-aa91-4f56-b43a-697a6a1e02a0
22/06/16 09:20:22 INFO ShutdownHookManager: Deleting directory /data6/hadoop/yarn/local/usercache/hdfs/appcache/application_1654630917149_16762/spark-28474654-2043-4f6e-8f4d-1e5481ae9b05
22/06/16 09:20:22 INFO ShutdownHookManager: Deleting directory /data7/hadoop/yarn/local/usercache/hdfs/appcache/application_1654630917149_16762/spark-771f67ef-3ac4-41fe-9cef-0daffc55865a
22/06/16 09:20:22 INFO ShutdownHookManager: Deleting directory /data11/hadoop/yarn/local/usercache/hdfs/appcache/application_1654630917149_16762/spark-55c32301-2f79-4138-bb0a-1ad037b26df5
22/06/16 09:20:22 INFO ShutdownHookManager: Deleting directory /data10/hadoop/yarn/local/usercache/hdfs/appcache/application_1654630917149_16762/spark-b4b76b3a-1207-4f00-a12b-3a569c8e500d
22/06/16 09:20:22 INFO ShutdownHookManager: Deleting directory /data12/hadoop/yarn/local/usercache/hdfs/appcache/application_1654630917149_16762/spark-aab310b3-23d5-4ee9-a60b-b95e39313d77
22/06/16 09:20:22 INFO ShutdownHookManager: Deleting directory /data9/hadoop/yarn/local/usercache/hdfs/appcache/application_1654630917149_16762/spark-a50fa878-0fe5-48df-9e18-d710167a2cac
22/06/16 09:20:22 INFO ShutdownHookManager: Deleting directory /data4/hadoop/yarn/local/usercache/hdfs/appcache/application_1654630917149_16762/spark-505d373b-94f1-4292-8b01-dbbaf301eecc


For more detailed output, check the application tracking page: http://HZ3-BD-2004-P1392:8088/cluster/app/application_1654630917149_16762 Then click on links to logs of each attempt.
. Failing the application.
     ApplicationMaster host: N/A
     ApplicationMaster RPC port: -1
     queue: ll
     start time: 1655342409537
     final status: FAILED
     tracking URL: http://HZ3-BD-2004-P1392:8088/cluster/app/application_1654630917149_16762
     user: hdfs
22/06/16 09:20:23 INFO Client: Deleted staging directory hdfs://hz3bdcrossborder01/user/hdfs/.sparkStaging/application_1654630917149_16762
22/06/16 09:20:23 ERROR Client: Application diagnostics message: Application application_1654630917149_16762 failed 2 times due to AM Container for appattempt_1654630917149_16762_000002 exited with  exitCode: 13

Kyuubi Server Configurations

No response

Kyuubi Engine Configurations

No response

Additional context

No response

Are you willing to submit PR?

Narcasserun commented 2 years ago

@yaooqinn, can you help me?

yaooqinn commented 2 years ago

--conf spark.kyuubi.sparklens.jars.location=/home/dubbo/apache-kyuubi-1.5.1-incubating-bin/external_jars/lens_2.12-1.0.2-SNAPSHOT.jar \

This may be related: the jar is on the Kyuubi server's local filesystem, which is unreachable from the driver in cluster mode.
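One way to follow up on this diagnosis would be to put the jar somewhere the AM container can read it. A minimal sketch, assuming the target HDFS directory (`/user/hdfs/jars`) is hypothetical:

```shell
# Hypothetical workaround sketch: copy the listener jar to HDFS so the
# driver running inside the YARN AM container can reach it.
hdfs dfs -mkdir -p /user/hdfs/jars
hdfs dfs -put -f \
    /home/dubbo/apache-kyuubi-1.5.1-incubating-bin/external_jars/lens_2.12-1.0.2-SNAPSHOT.jar \
    /user/hdfs/jars/

# Then reference the cluster-visible copy instead of the server-local path,
# e.g.:
#   --jars hdfs:///user/hdfs/jars/lens_2.12-1.0.2-SNAPSHOT.jar
```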

Narcasserun commented 2 years ago

> --conf spark.kyuubi.sparklens.jars.location=/home/dubbo/apache-kyuubi-1.5.1-incubating-bin/external_jars/lens_2.12-1.0.2-SNAPSHOT.jar \
>
> This may be related: the jar is on the Kyuubi server's local filesystem, which is unreachable from the driver in cluster mode.

I already pass this jar via `--jars` as well.

yaooqinn commented 2 years ago

You can check the AM (ApplicationMaster) log first.
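The AM container logs can usually be pulled with the Hadoop CLI once log aggregation has completed, using the application ID from the engine log above:

```shell
# Fetch the aggregated YARN logs for the failed application; the AM
# container's stderr will show why the driver exited with code 13.
yarn logs -applicationId application_1654630917149_16762 | less
```

On older Hadoop releases, narrowing the output to a single container may additionally require `-containerId` together with `-nodeAddress`.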