mafeko opened this issue 6 years ago (status: Open)
I think I got a little closer to the solution. I could find the correct Spark endpoint in my setup by looking up the configuration of the SparkContext (sc):
print(sc.getConf().get('spark.org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter.param.PROXY_URI_BASES').split(','))
In my case this is an HTTPS URL, which now seems to be another challenge.
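To illustrate the lookup above: the `PROXY_URI_BASES` property is a comma-separated list of proxied UI base URLs, and on clusters with HTTPS enabled one of them is typically the HTTPS one. A minimal sketch of picking a base from that value (the helper `pick_proxy_base` and the example value are hypothetical, not part of the extension):

```python
def pick_proxy_base(proxy_uri_bases: str, prefer_https: bool = True) -> str:
    """Pick one UI base URL from the comma-separated PROXY_URI_BASES value."""
    bases = [b.strip() for b in proxy_uri_bases.split(",") if b.strip()]
    if not bases:
        raise ValueError("no proxy URI bases configured")
    if prefer_https:
        for b in bases:
            if b.startswith("https://"):
                return b
    # Fall back to the first configured base (often plain HTTP).
    return bases[0]

# Hypothetical value, shaped like what sc.getConf() returns on YARN:
example = ("https://rm-host:8090/proxy/application_1234_0001,"
           "http://rm-host:8088/proxy/application_1234_0001")
print(pick_proxy_base(example))
```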
Hey, any more on this? I'm looking at rigging this up with YARN as well.
Hi, your tool looks really promising.
Currently I get:
{"error": "SPARK_NOT_RUNNING"}
when running /spark in my setup. How can I configure the extension to work in an environment like this:
pyspark2 --deploy-mode client --master yarn
Could you give me a hint on how to configure the setting in this case?
jupyter notebook --Spark.url="http://localhost:4040"
Thank you.
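One way to check whether the URL you pass via `--Spark.url` actually reaches a live Spark UI is to probe Spark's monitoring REST API, which is served under `/api/v1/applications` on the UI. A small sketch (the helper name is hypothetical; the probe assumes the UI is reachable from the notebook host):

```python
from urllib.request import urlopen  # only needed for the optional probe

def applications_endpoint(ui_base: str) -> str:
    """Build the Spark REST URL used to check whether Spark is running."""
    # Spark's monitoring REST API lives under /api/v1/applications on the UI.
    return ui_base.rstrip("/") + "/api/v1/applications"

print(applications_endpoint("http://localhost:4040"))

# Optional probe (requires a running Spark UI at that address):
# with urlopen(applications_endpoint("http://localhost:4040")) as resp:
#     print(resp.status)  # 200 means the UI is up and answering
```

In client mode on YARN the driver UI still binds locally (port 4040 by default), so probing that address from the driver host is a quick sanity check before pointing the extension at it.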