Segence / docker-hadoop

A Docker container with a full Hadoop cluster setup with Spark and Zeppelin

Spark WebUI is not responding #1

Closed marianormuro closed 7 years ago

marianormuro commented 7 years ago

The Spark UI is not running after following the instructions for setting up a local Hadoop cluster (not standalone). All other UIs are up and HDFS is running.

robvadai commented 7 years ago

Hi @marianormuro,

Spark uses YARN in this setup, which means a standalone Spark server is not started automatically. If you start a Spark job, either by submitting a JAR file to YARN or by opening a Spark shell session (as described under the Running a sample interactive Spark job section in the README), you should be able to reach the Spark UI at http://localhost:4040/jobs/ (see the commands sketched below).
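For reference, here is a minimal sketch of both options, assuming the Spark binaries are on the PATH inside the container and that YARN is the configured resource manager. The example class and JAR path are just the stock SparkPi example shipped with Spark, not something specific to this repo, and the exact path may differ between Spark versions:

```bash
# Option 1: open an interactive Spark shell on YARN.
# The driver runs locally, so the Spark UI is served on port 4040
# for as long as the shell session stays open.
spark-shell --master yarn --deploy-mode client

# Option 2: submit a packaged job to YARN.
# Replace the class and JAR path with your own application.
spark-submit \
  --master yarn \
  --deploy-mode client \
  --class org.apache.spark.examples.SparkPi \
  "$SPARK_HOME"/examples/jars/spark-examples_*.jar 100
```

While the driver is alive, http://localhost:4040/jobs/ should respond; once the shell or job exits, the port is released again.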

Please let me know how it goes.