Graylog Extended Log Format (GELF) implementation in Java for all major logging frameworks: log4j, log4j2, java.util.logging, logback, JBossAS7 and WildFly 8-12
I'm facing a problem when running a Spark/Scala application: when I try to submit the application, it shows this error message.
ERROR ApplicationMaster: User class threw exception: java.lang.NoSuchMethodError: biz.paluch.logging.gelf.log4j.GelfLogAppender.setAdditionalFieldTypes(Ljava/lang/String;)V
java.lang.NoSuchMethodError: biz.paluch.logging.gelf.log4j.GelfLogAppender.setAdditionalFieldTypes(Ljava/lang/String;)V
I am using this jar in my build.sbt file:
"biz.paluch.logging" % "logstash-gelf" % "1.12.0"
I tried changing the jar version, but in that case it shows the error message below while building the jar:
[error] CustomLogger.scala:30: value setAdditionalFieldTypes is not a member of biz.paluch.logging.gelf.log4j.GelfLogAppender
[error] appender.setAdditionalFieldTypes("string")
[error] ^
[error] CustomLogger.scala:32: value setAdditionalFieldTypes is not a member of biz.paluch.logging.gelf.log4j.GelfLogAppender
[error] appender.setAdditionalFieldTypes("string")
[error] ^
[error] CustomLogger.scala:35: value setIncludeFullMdc is not a member of biz.paluch.logging.gelf.log4j.GelfLogAppender
[error] appender.setIncludeFullMdc(true)
[error] ^
[error] three errors found
[error] (compile:compileIncremental) Compilation failed
Can someone kindly suggest a fix?
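For context on why both errors can appear: code that compiles against 1.12.0 but throws `NoSuchMethodError` at runtime usually means an older logstash-gelf is found first on the cluster's runtime classpath, while compiling against the older jar fails because the method does not exist there. A minimal diagnostic sketch to see which jar a class is actually loaded from (the class name below is only a stand-in; inside the Spark job, pass `biz.paluch.logging.gelf.log4j.GelfLogAppender` instead):

```scala
// Minimal diagnostic sketch: report the jar a class was actually loaded from.
object WhichJar {
  def locationOf(className: String): String = {
    val src = Class.forName(className).getProtectionDomain.getCodeSource
    // Classes from the bootstrap classloader report no code source.
    if (src == null) "(bootstrap classloader, no jar)" else src.getLocation.toString
  }

  def main(args: Array[String]): Unit =
    // Stand-in class; substitute the GelfLogAppender class name in the real job.
    println(locationOf("scala.Option"))
}
```

If the printed location is a cluster-provided jar rather than the application assembly, the runtime is resolving an older copy of the library.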
My code:
class StreamingLens_POC(spark: SparkSession, options: RequestBuilder)

object StreamingLens_POC {
  def main(args: Array[String]): Unit = {
    val applicationName = args(0)
    val spark = SparkSession
      .builder()
      .appName(applicationName)
      //.config("spark.master", "local") // additional config to execute locally
      .getOrCreate()
    val streamingLens = new StreamingLens_POC(spark, options) // new line added for StreamingLens
    // ..... remaining code .....
  }
}
These are the other jars which I am using for my application:
"org.apache.spark" % "spark-core_2.11" % "2.2.0", "org.apache.kafka" % "kafka_2.11" % "0.10.2.1", "org.apache.kafka" % "kafka-clients" % "0.10.2.1", "org.apache.spark" %% "spark-streaming-kafka-0-10" % "2.2.0", "org.apache.spark" %% "spark-streaming" % "2.2.0", "org.apache.spark" %% "spark-sql" % "2.2.0", "org.apache.spark" %% "spark-sql-kafka-0-10" % "2.2.0", "com.google.code.gson" % "gson" % "2.8.0", "org.apache.commons" % "commons-dbcp2" % "2.0.1", "biz.paluch.logging" % "logstash-gelf" % "1.12.0", "com.github.scopt" %% "scopt" % "3.7.1", "org.apache.ignite" % "ignite-core" % "2.7.0", "org.apache.cxf" % "cxf-core" % "3.1.0", "com.qubole" %% "spark-streaminglens" % "0.5.3"
My Spark-Submit Command:
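(The actual command is not reproduced here. As a sketch of one common mitigation for this kind of compile-time/runtime version clash: Spark's `userClassPathFirst` settings, which are real but experimental Spark configs, make the application's copy of a library win over a cluster-provided one. The class and jar names below are placeholders, not the real command.)

```shell
# Sketch only: placeholder class/jar names; the two --conf flags are real
# (experimental) Spark settings that prefer the application jar's classes
# over cluster-provided copies of the same library.
spark-submit \
  --class StreamingLens_POC \
  --conf spark.driver.userClassPathFirst=true \
  --conf spark.executor.userClassPathFirst=true \
  streaminglens-poc-assembly.jar MyAppName
```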