banzaicloud / spark-metrics

Spark metrics related custom classes and sinks (e.g. Prometheus)
Apache License 2.0

Support Spark 3.0 #42

Closed: kangtiann closed this issue 3 years ago

kangtiann commented 4 years ago

Describe the bug

./bin/spark-submit --master spark://KANGTIAN-MB0:7077 --class org.apache.spark.examples.SparkPi     --repositories https://raw.github.com/banzaicloud/spark-metrics/master/maven-repo/releases     --packages com.banzaicloud:spark-metrics_2.11:2.3-2.1.0,io.prometheus:simpleclient:0.3.0,io.prometheus:simpleclient_dropwizard:0.3.0,io.prometheus:simpleclient_pushgateway:0.3.0,io.dropwizard.metrics:metrics-core:3.1.2     ~/Documents/program/spark/examples/jars/spark-examples_2.11-2.4.4.jar  1000

ERROR:

19/11/08 13:22:47 INFO SparkContext: Successfully stopped SparkContext
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/internal/Logging$class
    at com.banzaicloud.spark.metrics.sink.PrometheusSink.<init>(PrometheusSink.scala:47)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at org.apache.spark.metrics.MetricsSystem.$anonfun$registerSinks$1(MetricsSystem.scala:216)
    at scala.collection.mutable.HashMap.$anonfun$foreach$1(HashMap.scala:149)
    at scala.collection.mutable.HashTable.foreachEntry(HashTable.scala:237)
    at scala.collection.mutable.HashTable.foreachEntry$(HashTable.scala:230)
    at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:44)
    at scala.collection.mutable.HashMap.foreach(HashMap.scala:149)
    at org.apache.spark.metrics.MetricsSystem.registerSinks(MetricsSystem.scala:196)
    at org.apache.spark.metrics.MetricsSystem.start(MetricsSystem.scala:106)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:571)
    at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2555)
    at org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$5(SparkSession.scala:896)
    at scala.Option.getOrElse(Option.scala:189)
    at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:887)
    at org.apache.spark.examples.SparkPi$.main(SparkPi.scala:31)
    at org.apache.spark.examples.SparkPi.main(SparkPi.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
    at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:901)
    at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:179)
    at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:202)
    at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:89)
    at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:980)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:989)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.internal.Logging$class
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    ... 32 more
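For context (not part of the original report): this failure is a Scala binary-compatibility issue, not a Spark API change. Scala 2.11 compiled concrete trait methods into a synthetic `Trait$class` helper, while Scala 2.12 (which Spark 3.0 is built on) emits JVM default methods instead, so a class like `org.apache.spark.internal.Logging$class` simply does not exist on a Spark 3.x classpath. A minimal self-contained sketch of the same effect, using a hypothetical `Greeter` trait:

```scala
// Sketch: Scala 2.11 would emit a synthetic `Greeter$class` holding the
// body of `greet`; Scala 2.12+ compiles it as a JVM default method, so
// no such helper class exists. A 2.11-built jar that references the
// helper (as spark-metrics_2.11 does for Logging$class) then fails with
// NoClassDefFoundError at runtime.
trait Greeter {
  def greet: String = "hello" // concrete trait method
}

object TraitEncodingCheck extends App {
  private def classExists(name: String): Boolean =
    try { Class.forName(name); true }
    catch { case _: ClassNotFoundException => false }

  // On Scala 2.12 or later this reports false: the same missing-helper
  // situation the stack trace above shows for Logging.
  println(s"Greeter$$class exists: ${classExists("Greeter$class")}")
}
```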


stoader commented 4 years ago

@kangtiann Apache Spark 3.0 has not been released yet (it's still in preview). Once a stable version of Apache Spark 3.0 is released, we'll publish a spark-metrics release for it.
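As a side note (not from the thread): the mismatch can be confirmed from any Spark application before picking an artifact, since the runtime Scala version determines which `_2.11` / `_2.12` suffix can be loaded. A small sketch using the standard library's `scala.util.Properties`:

```scala
// Sketch: print the running Scala version and the artifact suffix a
// binary-compatible dependency would need. Spark 3.0 ships on Scala
// 2.12, so the spark-metrics_2.11 coordinate from the report above
// cannot work there.
object ScalaVersionCheck extends App {
  // e.g. "version 2.12.10" on a Spark 3.0 build
  println(scala.util.Properties.versionString)
  // the required artifact suffix, e.g. "2.12"
  println(scala.util.Properties.versionNumberString.split('.').take(2).mkString("."))
}
```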

kangtiann commented 4 years ago

Oh, got it, thanks!

f1yegor commented 4 years ago

I don't know about that, @kangtiann: we are using it with spark-3.0.0-preview just fine. Take a look at https://github.com/f1yegor/spark-metrics/tree/more_cool_stuff
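For readers wondering what such a fork changes (this is an illustrative sketch, not the fork's actual build definition): getting the sink to load on Spark 3.0.0-preview essentially means rebuilding against Scala 2.12 and the 3.x Spark artifacts. A hypothetical `build.sbt` fragment, with version numbers chosen only as examples:

```scala
// Hypothetical build.sbt sketch -- versions are illustrative assumptions.
// The key change for Spark 3.0 support is compiling with Scala 2.12 so
// the jar no longer references 2.11-era Trait$class helpers.
ThisBuild / scalaVersion := "2.12.10"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "3.0.0-preview" % "provided",
  "io.prometheus" % "simpleclient" % "0.3.0",
  "io.prometheus" % "simpleclient_dropwizard" % "0.3.0",
  "io.prometheus" % "simpleclient_pushgateway" % "0.3.0"
)
```

The `%%` operator makes sbt append the Scala binary suffix automatically, which is exactly the `_2.11` vs `_2.12` distinction at the root of this issue.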

xutaoustc commented 4 years ago

Looking forward to a spark-metrics release for Spark 3.0.

aminh73 commented 3 years ago

It seems that Spark 3.0.1 has been released. We are really interested in using this package with Spark 3.