byzer-org / byzer-lang

Byzer (formerly MLSQL): a low-code, open-source programming language for data pipelines, analytics, and AI.
https://www.byzer.org
Apache License 2.0

Byzer server on YARN cannot start: java.lang.ClassNotFoundException: org.apache.spark.ps.cluster.PSExecutorPlugin #1782

Closed xiaohei-info closed 2 years ago

xiaohei-info commented 2 years ago

byzer version: byzer-lang-2.4.3-2.3.0.1
spark version: 2.11-2.4.7.7.1.7.0-551 (Cloudera CDH)
scala version: 2.11.12

allwefantasy commented 2 years ago

There are two ways to fix this.

First, ship the Byzer main jar with --jars as well. Second, try adding a startup parameter:

 -streaming.ps.cluster.enable false

Root cause: Byzer also performs some initialization on the executor side, so the corresponding classes must be shipped via --jars. Alternatively, if you can do without that initialization, the second option turns it off.

Also, the Spark versions officially tested by Byzer are 2.4.3 and 3.1.1.
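The first workaround amounts to adding the Byzer main jar to the --jars list passed to spark-submit. A runnable sketch of building that argument (a temp dir stands in for the real install; the path and jar name are illustrative, mirroring the layout shown later in this thread):

```shell
# Workaround 1, sketched: build the --jars value for the Byzer main jar.
# A temp dir stands in for the real BYZER_HOME so this sketch runs anywhere.
BYZER_HOME="$(mktemp -d)/byzer-lang-2.4.3-2.3.0.1"   # hypothetical install path
mkdir -p "$BYZER_HOME/main"
touch "$BYZER_HOME/main/byzer-lang-2.4.3-2.11-2.3.0.1.jar"

# Pick up the main jar; shipping it to executors lets them load classes
# such as org.apache.spark.ps.cluster.PSExecutorPlugin.
MAIN_JAR=$(ls "$BYZER_HOME"/main/byzer-lang-*.jar | head -n 1)
echo "--jars $MAIN_JAR"
```

On a real install, this --jars value would be appended to the spark-submit invocation that bin/byzer.sh generates.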

allwefantasy commented 2 years ago

Also, if you started it with the startup script bundled with Byzer, could you paste the complete log from that script?

ZhengshuaiPENG commented 2 years ago

Also, if you started it with the startup script bundled with Byzer, could you paste the complete log from that script?

To make troubleshooting easier, please provide logs/shell.stderr, logs/byzer.out, logs/byzer.log, as well as conf/byzer.properties and conf/byzer.properties.override.

xiaohei-info commented 2 years ago

Also, if you started it with the startup script bundled with Byzer, could you paste the complete log from that script?

To make troubleshooting easier, please provide logs/shell.stderr, logs/byzer.out, logs/byzer.log, as well as conf/byzer.properties and conf/byzer.properties.override.

conf/byzer.properties.override

byzer.server.mode=server
streaming.master=yarn
streaming.name=Byzer-lang-server
streaming.rest=true
streaming.thrift=false
streaming.platform=spark
streaming.spark.service=true
streaming.driver.port=9003
streaming.enableHiveSupport=true
streaming.datalake.path=./delta
spark.driver.memory=2g
spark.executor.memory=1g
spark.driver.cores=1
spark.executor.cores=1
spark.executor.instances=10

bin/byzer.sh start

Starting Byzer engine...

Byzer-lang is checking installation environment, log is at /opt/soft/byzer-lang-2.4.3-2.3.0.1/logs/check-env.out

Checking OS ...................................................[PASS]
Checking Java Version ...................................................[PASS]
Checking Ports Availability ...................................................[PASS]

Checking environment finished successfully. To check again, run 'bin/check-env.sh' manually.

SPARK_HOME is: /opt/cloudera/parcels/CDH/lib/spark
BYZER_HOME is: /opt/soft/byzer-lang-2.4.3-2.3.0.1
BYZER_CONFIG_FILE is: /opt/soft/byzer-lang-2.4.3-2.3.0.1/conf/byzer.properties
Starting Byzer engine in server mode...

[Spark Config]
--conf spark.kryoserializer.buffer=256k
--conf spark.executor.memory=1g
--conf spark.driver.memory=2g
--conf spark.kryoserializer.buffer.max=1024m
--conf spark.sql.hive.thriftServer.singleSession=true
--conf spark.master=local[*]
--conf spark.scheduler.mode=FAIR
--conf spark.executor.cores=1
--conf spark.serializer=org.apache.spark.serializer.KryoSerializer
--conf spark.executor.instances=10
--conf spark.driver.cores=1

[Byzer Config]
-streaming.spark.service true
-streaming.driver.port 9003
-streaming.platform spark
-streaming.name Byzer-lang-server
-streaming.thrift false
-streaming.master yarn
-streaming.rest true
-streaming.datalake.path /warehouse/deltalake
-streaming.enableHiveSupport true

[Extra Config]
/opt/soft/byzer-lang-2.4.3-2.3.0.1/plugin/*.jar:/opt/soft/byzer-lang-2.4.3-2.3.0.1/libs/a_guava-28.1.jre.jar:/opt/soft/byzer-lang-2.4.3-2.3.0.1/libs/ansj_seg-5.1.6.jar:/opt/soft/byzer-lang-2.4.3-2.3.0.1/libs/nlp-lang-1.7.8.jar:/opt/soft/byzer-lang-2.4.3-2.3.0.1/main/byzer-lang-2.4.3-2.11-2.3.0.1.jar

Byzer engine is starting. It may take a while. For status, please visit http://127.0.0.1:9003.

You may also check status via: PID:3325, or Log: /opt/soft/byzer-lang-2.4.3-2.3.0.1/logs/byzer-lang.log.

logs/byzer-lang.log

22/05/16 21:21:03 WARN YarnSchedulerBackend$YarnSchedulerEndpoint: Requesting driver to remove executor 7449 for reason Container from a bad node: container_e06_1651462646567_0164_01_011189 on host: findataplat-worker01.tg.mt.com. Exit status: 1. Diagnostics: [2022-05-16 21:21:03.581]Exception from container-launch. Container id: container_e06_1651462646567_0164_01_011189 Exit code: 1 Exception message: Launch container failed Shell output: main : command provided 1 main : run as user is hive main : requested yarn user is hive Getting exit code file... Creating script paths... Writing pid file... Writing to tmp file /opt/yarn/nm/nmPrivate/application_1651462646567_0164/container_e06_1651462646567_0164_01_011189/container_e06_1651462646567_0164_01_011189.pid.tmp Writing to cgroup task files... Creating local dirs... Launching container...

[2022-05-16 21:21:03.582]Container exited with a non-zero exit code 1. Error file: prelaunch.err. Last 4096 bytes of prelaunch.err : Last 4096 bytes of stderr : ocal directory at /opt/yarn/nm/usercache/hive/appcache/application_1651462646567_0164/blockmgr-37586980-ff61-4917-94a3-3aab884ef123 22/05/16 21:21:02 INFO memory.MemoryStore: MemoryStore started with capacity 366.3 MB 22/05/16 21:21:03 INFO executor.CoarseGrainedExecutorBackend: Connecting to driver: spark://CoarseGrainedScheduler@findataplat-utility.tg.mt.com:30157 22/05/16 21:21:03 INFO executor.CoarseGrainedExecutorBackend: Successfully registered with driver 22/05/16 21:21:03 INFO executor.Executor: Starting executor ID 7449 on host findataplat-worker01.tg.mt.com 22/05/16 21:21:03 INFO util.Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 15269. 22/05/16 21:21:03 INFO netty.NettyBlockTransferService: Server created on findataplat-worker01.tg.mt.com:15269 22/05/16 21:21:03 INFO storage.BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy 22/05/16 21:21:03 INFO storage.BlockManagerMaster: Registering BlockManager BlockManagerId(7449, findataplat-worker01.tg.mt.com, 15269, None) 22/05/16 21:21:03 INFO storage.BlockManagerMaster: Registered BlockManager BlockManagerId(7449, findataplat-worker01.tg.mt.com, 15269, None) 22/05/16 21:21:03 INFO storage.BlockManager: external shuffle service port = 7337 22/05/16 21:21:03 INFO storage.BlockManager: Registering executor with local external shuffle service. 
22/05/16 21:21:03 INFO client.TransportClientFactory: Successfully created connection to findataplat-worker01.tg.mt.com/10.199.132.216:7337 after 1 ms (0 ms spent in bootstraps) 22/05/16 21:21:03 INFO storage.BlockManager: Initialized BlockManager: BlockManagerId(7449, findataplat-worker01.tg.mt.com, 15269, None) 22/05/16 21:21:03 ERROR executor.CoarseGrainedExecutorBackend: Executor self-exiting due to : Unable to create executor due to org.apache.spark.ps.cluster.PSExecutorPlugin java.lang.ClassNotFoundException: org.apache.spark.ps.cluster.PSExecutorPlugin at java.net.URLClassLoader.findClass(URLClassLoader.java:382) at java.lang.ClassLoader.loadClass(ClassLoader.java:418) at java.lang.ClassLoader.loadClass(ClassLoader.java:351) at java.lang.Class.forName0(Native Method) at java.lang.Class.forName(Class.java:348) at org.apache.spark.util.Utils$.classForName(Utils.scala:194) at org.apache.spark.util.Utils$$anonfun$loadExtensions$1.apply(Utils.scala:2660) at org.apache.spark.util.Utils$$anonfun$loadExtensions$1.apply(Utils.scala:2658) at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241) at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241) at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59) at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48) at scala.collection.TraversableLike$class.flatMap(TraversableLike.scala:241) at scala.collection.AbstractTraversable.flatMap(Traversable.scala:104) at org.apache.spark.util.Utils$.loadExtensions(Utils.scala:2658) at org.apache.spark.executor.Executor$$anonfun$6.apply(Executor.scala:148) at org.apache.spark.executor.Executor$$anonfun$6.apply(Executor.scala:147) at org.apache.spark.util.Utils$.withContextClassLoader(Utils.scala:205) at org.apache.spark.executor.Executor.(Executor.scala:147) at 
org.apache.spark.executor.CoarseGrainedExecutorBackend$$anonfun$receive$1.applyOrElse(CoarseGrainedExecutorBackend.scala:83) at org.apache.spark.rpc.netty.Inbox$$anonfun$process$1.apply$mcV$sp(Inbox.scala:117) at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:205) at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:101) at org.apache.spark.rpc.netty.Dispatcher$MessageLoop.run(Dispatcher.scala:221) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) at java.lang.Thread.run(Thread.java:748) 22/05/16 21:21:03 INFO storage.DiskBlockManager: Shutdown hook called 22/05/16 21:21:03 INFO util.ShutdownHookManager: Shutdown hook called

[2022-05-16 21:21:03.583]Container exited with a non-zero exit code 1. Error file: prelaunch.err. (remaining stderr is identical to the block above and is omitted)

logs/shell.stderr

(identical to the logs/byzer-lang.log output above; omitted)

allwefantasy commented 2 years ago

That looks odd; the main jar should already be shipped via --jars. How about adding streaming.ps.cluster.enable=false to the config file instead?
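The config-file form of that suggestion can be sketched like this (a temp file stands in for the real conf/byzer.properties.override under BYZER_HOME):

```shell
# Workaround 2, sketched: disable executor-side PS initialization via the
# override file. A temp file stands in for conf/byzer.properties.override.
CONF="$(mktemp -d)/byzer.properties.override"
touch "$CONF"

# Append the flag only if no value is already set for it:
grep -q '^streaming.ps.cluster.enable=' "$CONF" || \
  echo 'streaming.ps.cluster.enable=false' >> "$CONF"
cat "$CONF"
```

After editing the real override file, the engine needs a restart for the flag to take effect.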

xiaohei-info commented 2 years ago

That looks odd; the main jar should already be shipped via --jars. How about adding streaming.ps.cluster.enable=false to the config file instead?

There is another exception:

22/05/18 19:41:54 INFO Configuration: resource-types.xml not found
22/05/18 19:41:54 INFO ResourceUtils: Unable to find 'resource-types.xml'.
22/05/18 19:41:55 INFO YarnClientImpl: Submitted application application_1651462646567_0207
Exception in thread "main" java.lang.VerifyError: Stack map does not match the one at exception handler 77
Exception Details:
Location: com/fasterxml/jackson/databind/deser/std/StdDeserializer._parseDate(Lcom/fasterxml/jackson/core/JsonParser;Lcom/fasterxml/jackson/databind/DeserializationContext;)Ljava/util/Date; @77: astore
Reason: Type 'com/fasterxml/jackson/core/JsonParseException' (current frame, stack[0]) is not assignable to 'com/fasterxml/jackson/core/exc/StreamReadException' (stack map, stack[0])
Current Frame:
bci: @69
flags: { }
locals: { 'com/fasterxml/jackson/databind/deser/std/StdDeserializer', 'com/fasterxml/jackson/core/JsonParser', 'com/fasterxml/jackson/databind/DeserializationContext' }
stack: { 'com/fasterxml/jackson/core/JsonParseException' }
Stackmap Frame:
bci: @77
flags: { }
locals: { 'com/fasterxml/jackson/databind/deser/std/StdDeserializer', 'com/fasterxml/jackson/core/JsonParser', 'com/fasterxml/jackson/databind/DeserializationContext' }
stack: { 'com/fasterxml/jackson/core/exc/StreamReadException' }
Bytecode:
0x0000000: 2bb6 0035 aa00 0000 0000 0081 0000 0003
0x0000010: 0000 000b 0000 007a 0000 0081 0000 0081
0x0000020: 0000 0034 0000 0041 0000 0081 0000 0081
0x0000030: 0000 0081 0000 0071 2a2b b600 11b6 0012
0x0000040: 2cb6 006b b02b b600 4742 a700 223a 052c
0x0000050: 2ab4 0002 2bb6 006e 126f 03bd 0004 b600
0x0000060: 70c0 002d 3a06 1906 b600 4c42 bb00 7159
0x0000070: 21b7 0072 b02a 2cb6 0073 c000 71b0 2a2b
0x0000080: 2cb6 0074 b02c 2ab4 0002 2bb6 0025 c000
0x0000090: 71b0
Exception Handler Table:
bci [69, 74] => handler: 77
bci [69, 74] => handler: 77
Stackmap Table:
same_frame(@56)
same_frame(@69)
same_locals_1_stack_item_frame(@77,Object[#367])
append_frame(@108,Long)
chop_frame(@117,1)
same_frame(@126)
same_frame(@133)

at com.fasterxml.jackson.module.scala.deser.NumberDeserializers$.<init>(ScalaNumberDeserializersModule.scala:48)
at com.fasterxml.jackson.module.scala.deser.NumberDeserializers$.<clinit>(ScalaNumberDeserializersModule.scala)
at com.fasterxml.jackson.module.scala.deser.ScalaNumberDeserializersModule$class.$init$(ScalaNumberDeserializersModule.scala:60)
at com.fasterxml.jackson.module.scala.DefaultScalaModule.<init>(DefaultScalaModule.scala:18)
at com.fasterxml.jackson.module.scala.DefaultScalaModule$.<init>(DefaultScalaModule.scala:36)
at com.fasterxml.jackson.module.scala.DefaultScalaModule$.<clinit>(DefaultScalaModule.scala)
at org.apache.spark.util.JsonProtocol$.<init>(JsonProtocol.scala:60)
at org.apache.spark.util.JsonProtocol$.<clinit>(JsonProtocol.scala)
at org.apache.spark.scheduler.EventLoggingListener$.initEventLog(EventLoggingListener.scala:349)
at org.apache.spark.scheduler.EventLoggingListener.start(EventLoggingListener.scala:131)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:533)
at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2547)
at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:939)
at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:930)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:930)
at streaming.core.strategy.platform.SparkRuntime.createRuntime(SparkRuntime.scala:185)
at streaming.core.strategy.platform.SparkRuntime.<init>(SparkRuntime.scala:63)
at streaming.core.strategy.platform.SparkRuntime$.getOrCreate(SparkRuntime.scala:330)
at streaming.core.strategy.platform.SparkRuntime.getOrCreate(SparkRuntime.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at streaming.core.strategy.platform.PlatformManager$.createRuntimeByPlatform(PlatformManager.scala:237)
at streaming.core.strategy.platform.PlatformManager$.getRuntime(PlatformManager.scala:252)
at streaming.core.strategy.platform.PlatformManager.run(PlatformManager.scala:117)
at streaming.core.StreamingApp$.main(StreamingApp.scala:45)
at streaming.core.StreamingApp.main(StreamingApp.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:847)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:161)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:184)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:922)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:931)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
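A VerifyError like the one above, where com.fasterxml.jackson.core.JsonParseException is not assignable to com.fasterxml.jackson.core.exc.StreamReadException, is the signature of mixed Jackson versions on the classpath: a jackson-databind compiled against a newer jackson-core than the one actually loaded (CDH ships its own Jackson jars). A diagnostic sketch for spotting this; the directory layout and version numbers below are fabricated stand-ins, not taken from this cluster:

```shell
# Diagnostic sketch: list every Jackson jar the JVM could load. On a real
# install, run this against $SPARK_HOME/jars and $BYZER_HOME/libs; all
# jackson-* artifacts should carry the same version number.
DEMO=$(mktemp -d)                                  # stand-in for the real dirs
mkdir -p "$DEMO/spark-jars" "$DEMO/byzer-libs"
touch "$DEMO/spark-jars/jackson-core-2.10.5.jar" \
      "$DEMO/byzer-libs/jackson-databind-2.12.3.jar"   # hypothetical mismatch

# A version split like this one would reproduce the VerifyError symptom:
find "$DEMO" -name 'jackson-*.jar' | sort
```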
xiaohei-info commented 2 years ago


It is most likely still a Spark version problem; after switching to Spark 2.4.3 the server starts up normally.