mp911de / logstash-gelf

Graylog Extended Log Format (GELF) implementation in Java for all major logging frameworks: log4j, log4j2, java.util.logging, logback, JBossAS7 and WildFly 8-12
http://logging.paluch.biz
MIT License

java.lang.NoSuchMethodError: biz.paluch.logging.gelf.log4j.GelfLogAppender.setAdditionalFieldTypes(Ljava/lang/String;) #274

Closed rpatid10 closed 3 years ago

rpatid10 commented 3 years ago

Hi!

I'm facing a problem when running a Spark/Scala application: when I try to submit my application, it shows this error message.

```
ERROR ApplicationMaster: User class threw exception: java.lang.NoSuchMethodError: biz.paluch.logging.gelf.log4j.GelfLogAppender.setAdditionalFieldTypes(Ljava/lang/String;)V
java.lang.NoSuchMethodError: biz.paluch.logging.gelf.log4j.GelfLogAppender.setAdditionalFieldTypes(Ljava/lang/String;)V
```
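A `NoSuchMethodError` at runtime (as opposed to a compile error) usually means the class was compiled against one library version while an older version shadows it on the runtime classpath; on YARN, jars bundled with Spark or the cluster can win over the one in the build. As a diagnostic, a minimal sketch (the helper name is mine, not from the original post) can print which jar a class was actually loaded from:

```java
import java.security.CodeSource;

public class ClasspathCheck {
    // Returns the jar/path a class was loaded from, or "bootstrap/JDK"
    // for classes loaded by the bootstrap class loader.
    static String origin(String className) throws ClassNotFoundException {
        CodeSource src = Class.forName(className).getProtectionDomain().getCodeSource();
        return src == null ? "bootstrap/JDK" : src.getLocation().toString();
    }

    public static void main(String[] args) throws Exception {
        // The default class name is the one from the stack trace above; run
        // this inside the Spark driver/executor to see which jar actually wins.
        String name = args.length > 0 ? args[0]
                : "biz.paluch.logging.gelf.log4j.GelfLogAppender";
        System.out.println(name + " loaded from: " + origin(name));
    }
}
```

If the printed location is not the logstash-gelf 1.12.0 jar shipped with the application, the conflict is on the cluster side rather than in `build.sbt`.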

I am using this Jar in build.sbt file.

```
"biz.paluch.logging" % "logstash-gelf" % "1.12.0"
```

I tried changing the jar version, but then it shows the error message below while building the jar.

```
[error] CustomLogger.scala:30: value setAdditionalFieldTypes is not a member of biz.paluch.logging.gelf.log4j.GelfLogAppender
[error]   appender.setAdditionalFieldTypes("string")
[error]            ^
[error] CustomLogger.scala:32: value setAdditionalFieldTypes is not a member of biz.paluch.logging.gelf.log4j.GelfLogAppender
[error]   appender.setAdditionalFieldTypes("string")
[error]            ^
[error] CustomLogger.scala:35: value setIncludeFullMdc is not a member of biz.paluch.logging.gelf.log4j.GelfLogAppender
[error]   appender.setIncludeFullMdc(true)
[error]            ^
[error] three errors found
[error] (compile:compileIncremental) Compilation failed
```
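The sbt errors above mean the library version resolved at compile time lacks those setters, while the runtime `NoSuchMethodError` points at a second, different version on the cluster. A generic sketch of a runtime check (the `hasMethod` helper is mine, not part of logstash-gelf) that tells whether the class visible on the classpath actually exposes a given method:

```java
public class MethodCheck {
    // True if className declares a public method with this name and these
    // parameter types; false if the class or the method is missing.
    static boolean hasMethod(String className, String methodName, Class<?>... params) {
        try {
            Class.forName(className).getMethod(methodName, params);
            return true;
        } catch (ReflectiveOperationException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // On the Spark driver this would be, for example:
        // hasMethod("biz.paluch.logging.gelf.log4j.GelfLogAppender",
        //           "setAdditionalFieldTypes", String.class)
        System.out.println(hasMethod("java.lang.String", "substring", int.class));
    }
}
```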

Can someone kindly advise?

These are the other dependencies I am using in my application.

```
"org.apache.spark" % "spark-core_2.11" % "2.2.0",
"org.apache.kafka" % "kafka_2.11" % "0.10.2.1",
"org.apache.kafka" % "kafka-clients" % "0.10.2.1",
"org.apache.spark" %% "spark-streaming-kafka-0-10" % "2.2.0",
"org.apache.spark" %% "spark-streaming" % "2.2.0",
"org.apache.spark" %% "spark-sql" % "2.2.0",
"org.apache.spark" %% "spark-sql-kafka-0-10" % "2.2.0",
"com.google.code.gson" % "gson" % "2.8.0",
"org.apache.commons" % "commons-dbcp2" % "2.0.1",
"biz.paluch.logging" % "logstash-gelf" % "1.12.0",
"com.github.scopt" %% "scopt" % "3.7.1",
"org.apache.ignite" % "ignite-core" % "2.7.0",
"org.apache.cxf" % "cxf-core" % "3.1.0",
"com.qubole" %% "spark-streaminglens" % "0.5.3"
```

My Spark-Submit Command:

```
spark-submit \
  --name SPARK_STREAMING_POC \
  --num-executors 1 \
  --jars /home/username/jar/spark-streaminglens_2.11-0.5.3.jar,.(other required jar) \
  --master yarn \
  --deploy-mode cluster \
  --driver-cores 1 --driver-memory 2G --executor-cores 1 --executor-memory 2G \
  --supervise --class com.pkg.data.StreamingLens_POC \
  /home/username/jar/PrjectJarName.jar \
  SPARK_STREAMING_POC
```
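When the cluster ships an older copy of a library, one mitigation worth trying is to give the application's jars precedence over Spark's bundled classpath. The configuration keys below are real Spark settings, but whether they resolve this particular conflict is an assumption; this is a sketch of the flags added to the submit command above, not a verified fix:

```shell
# Hedged sketch: prefer the application's jars over Spark's own classpath
# on both the driver and the executors (YARN cluster mode).
spark-submit \
  --conf spark.driver.userClassPathFirst=true \
  --conf spark.executor.userClassPathFirst=true \
  ... # remaining options as in the command above
```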

Code:

```scala
class StreamingLens_POC(spark: SparkSession, options: RequestBuilder)

object StreamingLens_POC {
  def main(args: Array[String]): Unit = {
    val applicationName = args(0)
    val spark = SparkSession
      .builder()
      .appName(applicationName)
      //.config("spark.master", "local") // additional code to execute locally
      .getOrCreate()
    val streamingLens = new StreamingLens_POC(spark, options) // added this new line for StreamingLens
    // ..... Remaining Code.. .. .. .. ..
  }
}
```

mp911de commented 3 years ago

setAdditionalFieldTypes(String) has been part of the code for six years. I'm not proficient with Scala, so I cannot help here.