uber-common / jvm-profiler

JVM Profiler Sending Metrics to Kafka, Console Output or Custom Reporter

Error opening zip file or JAR manifest missing : jvm-profiler-1.0.0.jar #74

Closed · foxinmy closed this issue 4 years ago

foxinmy commented 4 years ago

Not working on Spark; the containers fail with the errors below.

```
Container: container_e300_1588140516225_2267_01_000003 on sz-5-centos44_8041

LogType:stderr
Log Upload Time:Fri May 22 15:36:26 +0800 2020
LogLength:72
Log Contents:
Error opening zip file or JAR manifest missing : jvm-profiler-1.0.0.jar

LogType:stdout
Log Upload Time:Fri May 22 15:36:26 +0800 2020
LogLength:84
Log Contents:
Error occurred during initialization of VM agent library failed to init: instrument
```

spark-default.conf:

```
spark.jars=hdfs://my_domain/DW/app/jars/jvm-profiler-1.0.0.jar
spark.executor.extraJavaOptions=-javaagent:jvm-profiler-1.0.0.jar=reporter=com.uber.profiling.reporters.KafkaOutputReporter,metricInterval=5000,brokerList=10.201.5.57:9092,topicPrefix=monitor.spark.
```

liangchen917 commented 4 years ago

Change -javaagent:jvm-profiler-1.0.0.jar to an absolute path.

foxinmy commented 4 years ago

> Change -javaagent:jvm-profiler-1.0.0.jar to an absolute path.

I tried an absolute path too and it still fails. Do I have to copy jvm-profiler-1.0.0.jar to every node?

The docs say you only need to upload it to HDFS and point spark.jars at it, but I still get this error.

liangchen917 commented 4 years ago

> Change -javaagent:jvm-profiler-1.0.0.jar to an absolute path.
>
> I tried an absolute path too and it still fails. Do I have to copy jvm-profiler-1.0.0.jar to every node?
>
> The docs say you only need to upload it to HDFS and point spark.jars at it, but I still get this error.

At first I got the same error as you. I first gave up on HDFS and copied the jar to every node, and got the same error. Then I changed the spark.jars path to a path that does not exist and still got the same error, which narrowed it down to the -javaagent setting: once I changed the -javaagent jar path to the full absolute path, it worked.
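
For reference, here is a minimal spark-submit sketch of that fix, reusing the Kafka reporter settings from the original post. It assumes jvm-profiler-1.0.0.jar has been copied to /opt/jvm-profiler-1.0.0.jar on every worker node; com.example.MyApp and my-app.jar are placeholders for your own application.

```
# Sketch only: assumes /opt/jvm-profiler-1.0.0.jar exists on every worker node.
# com.example.MyApp and my-app.jar are placeholders for your own application.
spark-submit \
  --master yarn \
  --class com.example.MyApp \
  --conf spark.executor.extraJavaOptions='-javaagent:/opt/jvm-profiler-1.0.0.jar=reporter=com.uber.profiling.reporters.KafkaOutputReporter,metricInterval=5000,brokerList=10.201.5.57:9092,topicPrefix=monitor.spark.' \
  my-app.jar
```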

foxinmy commented 4 years ago

> Change -javaagent:jvm-profiler-1.0.0.jar to an absolute path.
>
> I tried an absolute path too and it still fails. Do I have to copy jvm-profiler-1.0.0.jar to every node? The docs say you only need to upload it to HDFS and point spark.jars at it, but I still get this error.
>
> At first I got the same error as you. I first gave up on HDFS and copied the jar to every node, and got the same error. Then I changed the spark.jars path to a path that does not exist and still got the same error, which narrowed it down to the -javaagent setting: once I changed the -javaagent jar path to the full absolute path, it worked.

Could you share your configuration?

pedro93 commented 4 years ago

I have a similar issue, could you please share the configuration that worked for you?

liangchen917 commented 4 years ago

> Change -javaagent:jvm-profiler-1.0.0.jar to an absolute path.
>
> I tried an absolute path too and it still fails. Do I have to copy jvm-profiler-1.0.0.jar to every node? The docs say you only need to upload it to HDFS and point spark.jars at it, but I still get this error.
>
> At first I got the same error as you. I first gave up on HDFS and copied the jar to every node, and got the same error. Then I changed the spark.jars path to a path that does not exist and still got the same error, which narrowed it down to the -javaagent setting: once I changed the -javaagent jar path to the full absolute path, it worked.
>
> Could you share your configuration?

```
--conf spark.jars=/opt/jvm-profiler-1.0.0.jar \
--conf spark.driver.extraJavaOptions='-javaagent:/opt/jvm-profiler-1.0.0.jar=sampleInterval=100,reporter=com.uber.profiling.reporters.InfluxDBOutputReporter,influxdb.host=101.12.72.101,influxdb.port=8086,influxdb.database=metrics,influxdb.username=spark,influxdb.password=spark'
```
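
For anyone who prefers to keep this in spark-default.conf (as in the original post) rather than on the spark-submit command line, the same settings would look roughly like this (values copied from the command above):

```
spark.jars=/opt/jvm-profiler-1.0.0.jar
spark.driver.extraJavaOptions=-javaagent:/opt/jvm-profiler-1.0.0.jar=sampleInterval=100,reporter=com.uber.profiling.reporters.InfluxDBOutputReporter,influxdb.host=101.12.72.101,influxdb.port=8086,influxdb.database=metrics,influxdb.username=spark,influxdb.password=spark
```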

liangchen917 commented 4 years ago

> I have a similar issue, could you please share the configuration that worked for you?

```
--conf spark.jars=/opt/jvm-profiler-1.0.0.jar \
--conf spark.driver.extraJavaOptions='-javaagent:/opt/jvm-profiler-1.0.0.jar=sampleInterval=100,reporter=com.uber.profiling.reporters.InfluxDBOutputReporter,influxdb.host=101.12.72.101,influxdb.port=8086,influxdb.database=metrics,influxdb.username=spark,influxdb.password=spark'
```

foxinmy commented 4 years ago

@pedro93 @liangchen917 spark.jars and spark.yarn.jars can't be used at the same time for the same jar; you can see it in the logs: WARN Client: Same path resource hdfs://hdfs_url/jarsjvm-profiler-1.0.0.jar added multiple times to distributed cache.
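
To illustrate the conflict described above, a hedged sketch (the HDFS directory mirrors the original post): the profiler jar should be referenced by only one of the two settings.

```
# Illustration only; the HDFS path mirrors the original post.
# If spark.yarn.jars already covers this jar (for example via a glob over the same
# directory), listing it again in spark.jars adds the same path to the distributed
# cache twice, producing the "added multiple times to distributed cache" warning.
--conf spark.jars=hdfs://my_domain/DW/app/jars/jvm-profiler-1.0.0.jar
# --conf spark.yarn.jars=hdfs://my_domain/DW/app/jars/*   # do not also reference the jar here
```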