juju-solutions / layer-apache-spark


Spark HA, removing/adding a master and the effect on established zeppelin context #23

Open andrewdmcleod opened 8 years ago

andrewdmcleod commented 8 years ago

When a Zeppelin interpreter is executed, it instantiates a Spark context using the current environment variables (e.g., $MASTERS). What happens to an existing Spark context used by a Zeppelin interpreter if a master node goes away or is added? If we need to restart Zeppelin or the Zeppelin interpreter(s), can we trigger that without affecting any long-running Zeppelin jobs?
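To illustrate the concern, here is a minimal sketch (plain Python, no Spark required) of why a live context would not notice master changes: the master URL is derived from the environment once, at context creation, so later updates to $MASTERS are invisible to it. The `$MASTERS` format (space-separated hostnames) and port 7077 are assumptions for illustration, not confirmed charm behavior.

```python
import os

def build_master_url(masters_env: str) -> str:
    """Build a standalone-HA master URL of the form
    spark://host1:7077,host2:7077 from a space-separated host list
    (hypothetical $MASTERS format; 7077 is the default standalone port)."""
    hosts = masters_env.split()
    return "spark://" + ",".join(f"{h}:7077" for h in hosts)

# The context captures whatever URL exists at creation time.
os.environ["MASTERS"] = "node-1 node-2"
url_at_creation = build_master_url(os.environ["MASTERS"])

# Simulate a master being replaced after the context exists:
os.environ["MASTERS"] = "node-1 node-3"

# The already-created context still points at the old master list.
print(url_at_creation)  # -> spark://node-1:7077,node-2:7077
```

In standalone HA mode Spark itself can fail over between the masters listed in the URL, but only among those named at creation time, which is exactly why adding or replacing a master raises this question.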

Testing required: deploy a Spark cluster, run a LONG RUNNING Zeppelin job, destroy/add nodes, and watch Zeppelin for errors.
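The test above could be driven with something like the following (a sketch only; the charm names, unit count, and unit numbering are assumptions, and the long-running notebook job has to be started by hand in between):

```shell
#!/bin/sh
set -e

# Deploy a small cluster (names assumed: "spark" and "zeppelin" charms).
juju deploy spark -n 3
juju deploy zeppelin
juju add-relation zeppelin spark

# ... start a long-running Zeppelin notebook job here, then:

# Remove one Spark unit and add a replacement while the job runs.
juju remove-unit spark/0
juju add-unit spark

# Watch Zeppelin's logs for context/connection errors.
juju debug-log --include zeppelin
```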

Another question: will the context eventually be recreated (by timeout, automatically, etc.) if/when the interpreter detects a difference in the Spark nodes?
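If an automatic recreation does not happen and a restart turns out to be necessary, one option is Zeppelin's interpreter REST API, which can restart a single interpreter setting (tearing down its Spark context; the next paragraph run recreates it with the then-current environment). A minimal sketch, assuming a reachable Zeppelin server and a hypothetical setting id:

```python
import urllib.request

def restart_url(base_url: str, setting_id: str) -> str:
    # Zeppelin REST endpoint for restarting one interpreter setting.
    return f"{base_url}/api/interpreter/setting/restart/{setting_id}"

def restart_interpreter(base_url: str, setting_id: str) -> str:
    # PUT to the restart endpoint; only the targeted setting is
    # restarted, so other interpreters' jobs are untouched.
    req = urllib.request.Request(
        restart_url(base_url, setting_id), method="PUT"
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode()

# Example (not executed here): base URL and setting id are assumptions.
# restart_interpreter("http://localhost:8080", "spark")
print(restart_url("http://localhost:8080", "spark"))
```

Note this still interrupts any paragraph currently executing on that interpreter, so it does not by itself answer the question about protecting long-running jobs.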