hortonworks-spark / spark-atlas-connector

A Spark Atlas connector to track data lineage in Apache Atlas
Apache License 2.0

SAC: Livy job submission, SparkSession issue for Atlas. Everything works in spark-shell and Zeppelin interactive sessions #289

Open nxverma opened 4 years ago

nxverma commented 4 years ago

20/01/30 18:58:58 INFO YarnSchedulerBackend$YarnSchedulerEndpoint: ApplicationMaster registered as NettyRpcEndpointRef(spark://YarnAM@ip-172-26-51-55.us-west-2.compute.internal:43495)
20/01/30 18:58:58 INFO YarnAllocator: Will request 7 executor container(s), each with 4 core(s) and 7296 MB memory (including 1152 MB of overhead)
20/01/30 18:58:58 INFO YarnAllocator: Submitted 7 unlocalized container requests.
20/01/30 18:58:58 INFO ApplicationMaster: Started progress reporter thread with (heartbeat : 3000, initial allocation : 200) intervals
20/01/30 18:58:58 INFO YarnClusterSchedulerBackend: SchedulerBackend is ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.0
20/01/30 18:58:58 INFO YarnClusterScheduler: YarnClusterScheduler.postStartHook done

20/01/30 18:59:21 WARN SparkUtils$: Fail to create Hive Configuration
java.lang.IllegalStateException: Cannot find active or default SparkSession in the current context
    at com.hortonworks.spark.atlas.utils.SparkUtils$.sparkSession(SparkUtils.scala:56)
    at com.hortonworks.spark.atlas.utils.SparkUtils$.liftedTree1$1(SparkUtils.scala:42)
    at com.hortonworks.spark.atlas.utils.SparkUtils$.hiveConf(SparkUtils.scala:41)
    at com.hortonworks.spark.atlas.utils.SparkUtils$$anonfun$2.apply(SparkUtils.scala:68)
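The warning above means SAC's `SparkUtils.sparkSession` lookup found neither an active nor a default `SparkSession` on the thread doing the lookup. A plausible explanation (an assumption, not confirmed from this log alone) is a thread-creation ordering issue: Spark tracks the active session in an `InheritableThreadLocal`, which a child thread only inherits if the value was already set when the thread was *constructed*. If SAC's background event-processor thread is created during ApplicationMaster startup in a Livy cluster-mode job, before the session exists, it sees nothing, whereas in spark-shell or Zeppelin the session exists first. A minimal JVM sketch of that timing effect, using hypothetical names in place of Spark's internals:

```java
// Sketch only: `activeSession` stands in for the InheritableThreadLocal that
// backs SparkSession.getActiveSession; the listener threads stand in for
// SAC's event processor. Names are illustrative, not Spark's real internals.
public class SessionLookupDemo {
    static final InheritableThreadLocal<String> activeSession =
        new InheritableThreadLocal<>();

    public static void main(String[] args) throws InterruptedException {
        // Thread constructed BEFORE any session is set -- analogous to a
        // listener starting during AM startup in cluster mode. It inherits
        // nothing, so the lookup comes back empty.
        Thread earlyListener = new Thread(() -> {
            String s = activeSession.get();
            System.out.println(s == null
                ? "early listener: no active session"
                : "early listener: " + s);
        });

        // Session is created only afterwards, on the driver thread.
        activeSession.set("driver-session");

        // Thread constructed AFTER the session exists inherits the value --
        // the interactive spark-shell / Zeppelin situation.
        Thread lateListener = new Thread(() ->
            System.out.println("late listener: " + activeSession.get()));

        earlyListener.start();
        earlyListener.join();
        lateListener.start();
        lateListener.join();
    }
}
```

Running this prints `early listener: no active session` followed by `late listener: driver-session`: inheritance happens at thread construction, so only the late thread sees the session, which matches the symptom of the failure occurring in cluster mode but not in interactive sessions.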