amplab / spark-ec2

Scripts used to set up a Spark cluster on EC2
Apache License 2.0
391 stars 298 forks

How to retain conf across Spark cluster stop/start via spark-ec2 #77

Open biolearning opened 7 years ago

biolearning commented 7 years ago

Some conf changes require a cluster restart to take effect, e.g. `SPARK_WORKER_OPTS`; but when stopping/starting the Spark cluster via spark-ec2, it seems to re-run cluster setup and overwrite all of the conf. Is there a way to keep these settings across a stop/start via spark-ec2? BTW, somehow I cannot stop and start the cluster via Spark's stop-all/start-all scripts.

key2market commented 7 years ago

Same here. `./spark-ec2 destroy my-spark-cluster` outputs `Searching for existing cluster my-spark-cluster in region us-east-1... Are you sure you want to destroy the cluster my-spark-cluster? (y/N) y`

even when there is no my-spark-cluster in region us-east-1. In other words, the script never reports that it did not find the relevant cluster in the given region; it looks as if it is working on something when in fact it is not shutting down anything at all.