gettyimages / docker-spark

Docker build for Apache Spark

Spark history server implementation #46

Closed: romainx closed this issue 4 years ago

romainx commented 5 years ago

Spark comes with a history server that provides a great UI with a lot of information about Spark job execution (event timeline, stage details, etc.). Details can be found on the Spark monitoring page.

I've modified gettyimages/docker-spark so that the history server can be run with the docker-compose up command.
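
For reference, a minimal sketch of what such a service could look like in docker-compose.yml (the image tag and the relative bin/spark-class command mirror the repo's existing master/worker services, so this may differ from the actual implementation):

history-server:
  image: gettyimages/spark:2.3.1-hadoop-3.0   # same image as the master/worker services
  # HistoryServer serves the UI from the events it finds in spark.history.fs.logDirectory
  command: bin/spark-class org.apache.spark.deploy.history.HistoryServer
  environment:
    SPARK_HISTORY_OPTS: -Dspark.history.fs.logDirectory=file:/tmp/spark-events
  ports:
    - 18080:18080                             # history server web UI
  volumes:
    - ./spark-events:/tmp/spark-events        # share event logs with the host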

With this implementation, the history server UI will be running at http://${YOUR_DOCKER_HOST}:18080.

[Screenshot: history-server UI]

To use Spark's history server, you have to tell your Spark driver where to log events.

By default, the container's /tmp/spark-events directory is mounted to ./spark-events at the root of the repo (I call it $DOCKER_SPARK). So you have to tell the driver to log events to this directory (on your local machine).

This example shows the configuration for a spark-submit call (the two --conf options):

# path to your local clone of the repo
DOCKER_SPARK="/Users/xxxx/Git/docker-spark"

$SPARK_HOME/bin/spark-submit \
  --class org.apache.spark.examples.SparkPi \
  --master spark://localhost:7077 \
  --conf "spark.eventLog.enabled=true" \
  --conf "spark.eventLog.dir=file:$DOCKER_SPARK/spark-events" \
  $SPARK_HOME/examples/jars/spark-examples_2.11-2.3.1.jar \
  10
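
Once the job completes, an application event log should appear in the local directory, and the history server should pick it up shortly after:

# the event log written by the driver
ls $DOCKER_SPARK/spark-events
# then browse the UI (macOS; otherwise open the URL in a browser)
open http://localhost:18080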

Note: These settings can be defined in the driver's $SPARK_HOME/conf/spark-defaults.conf to avoid passing the --conf options on each submit.
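
For example, the equivalent spark-defaults.conf entries could look like this (note that shell variables such as $DOCKER_SPARK are not expanded in this file, so the full path must be spelled out):

# event logging, equivalent to the two --conf options above
spark.eventLog.enabled  true
spark.eventLog.dir      file:/Users/xxxx/Git/docker-spark/spark-events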

This comment comes from my blog post.

OneCricketeer commented 4 years ago

> the log directory to use: spark.eventLog.dir file:/tmp/spark-event

If the drivers and executors are ephemeral, how can you access task logs via the history server?