I have found that modifying any Scala code through the above process does not affect the execution of the Spark shell. I am certain that my jar package and other settings are correct.
Even if I change BlazeSparkSessionExtension to the following, it still has no effect on the Spark shell. I am certain that the jar package has been updated:
class BlazeSparkSessionExtension extends (SparkSessionExtensions => Unit) with Logging {
  Shims.get.initExtension()

  override def apply(extensions: SparkSessionExtensions): Unit = {
    return
  }
}
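As a sanity check (a sketch only: spark.sql.extensions is the standard Spark config key for registering extensions, and the expectation that an active Blaze build rewrites the physical plan into native operators is an assumption about Blaze's normal behaviour), the following can be run inside spark-shell to confirm which extension class is registered and whether the no-op build is really the one being loaded:

// Run inside spark-shell: print the registered extension class, then check whether
// a simple query's physical plan still contains Blaze/native operators. With the
// no-op apply() above, the plan should be plain Spark; if native operators still
// appear, an older jar is on the classpath.
println(spark.conf.get("spark.sql.extensions", "<not set>"))
spark.range(10).selectExpr("sum(id)").explain()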
Scala logs are configured by conf/log4j2.properties; you can use a minimal conf:
rootLogger.level = info
rootLogger.appenderRef.stdout.ref = console
appender.console.type = Console
appender.console.name = console
appender.console.target = SYSTEM_ERR
appender.console.layout.type = PatternLayout
appender.console.layout.pattern = %d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n%ex
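To confirm the configuration is actually picked up, a quick check (a sketch only: the logger name is arbitrary, and this assumes the Log4j2 API bundled with Spark 3.3 is on the spark-shell classpath, which it is by default) is to emit a line through Log4j2 from inside the shell and look for it on stderr, formatted with the pattern above:

// Run inside spark-shell: log one line through the Log4j2 API so it goes through
// conf/log4j2.properties; it should appear on stderr with the configured pattern.
import org.apache.logging.log4j.LogManager
val log = LogManager.getLogger("logging-check")  // hypothetical logger name, any name works
log.info("scala/JVM side logging is active")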
Thank you very much.
Describe the bug
When I run on Spark333 and execute SQL commands in spark-shell, only Rust logs are printed. How do I print the Scala-side logs?
To Reproduce
3. Execute some code