Right now, running `spark_session()` calls `spark.session` and produces a lot of console messages when running locally:
```
> spark_session()
Spark not found in SPARK_HOME:
spark-2.4.5 for Hadoop 2.7 found, setting SPARK_HOME to /Users/dan.zafar/Library/Caches/spark/spark-2.4.5-bin-hadoop2.7
Launching java with spark-submit command /Users/dan.zafar/Library/Caches/spark/spark-2.4.5-bin-hadoop2.7/bin/spark-submit sparkr-shell /var/folders/ng/lwsgjtj52wx9kxct335y7_3h0000gq/T//Rtmp712l77/backend_port165c725f80bb1
20/03/08 19:24:14 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
Java ref type org.apache.spark.sql.SparkSession id 1
```
Can we clean those up or rebuild the function?
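One possible direction, as a minimal sketch: assuming `spark_session()` is a thin wrapper around `SparkR::sparkR.session()`, the R-side startup messages could be wrapped in `suppressMessages()` and the JVM log level lowered right after the session comes up. The `verbose` argument and the wrapper shape here are illustrative, not the package's actual API:

```r
library(SparkR)

# Hypothetical quieter wrapper around SparkR::sparkR.session().
spark_session <- function(..., verbose = FALSE) {
  if (verbose) {
    spark <- sparkR.session(...)
  } else {
    # suppressMessages() silences the R-side startup messages;
    # JVM log4j output written directly to stderr can still appear
    # until the log level is lowered below.
    spark <- suppressMessages(sparkR.session(...))
    setLogLevel("ERROR")
  }
  # invisible() avoids printing the "Java ref type ..." line at the console.
  invisible(spark)
}
```

One caveat: the `NativeCodeLoader` warning is emitted by the JVM during launch, before `setLogLevel()` can take effect, so fully silencing it would probably require something like shipping a custom `log4j.properties` through the session config rather than suppressing output on the R side.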