Open · biolearning opened this issue 7 years ago
Same here. Running

```
./spark-ec2 destroy my-spark-cluster
```

outputs

```
Searching for existing cluster my-spark-cluster in region us-east-1...
Are you sure you want to destroy the cluster my-spark-cluster? (y/N) y
```
even when there is no my-spark-cluster in region us-east-1. In other words, the script never reports that it did not find the cluster in that region; it looks as if it is working on something when in fact it is not shutting anything down at all.
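The missing guard described above could look something like the following sketch. This is not spark-ec2's actual code; the function names and instance representation are hypothetical, but it illustrates the expected behavior: refuse to prompt when no instances match the cluster's security groups (spark-ec2 identifies a cluster by its `<name>-master` / `<name>-slaves` groups).

```python
# Hypothetical sketch of the guard spark-ec2's destroy path should have:
# bail out with a clear message when nothing matches, instead of
# prompting and then destroying nothing.

def find_cluster_instances(instances, cluster_name):
    """Return instances whose security groups mark them as part of the
    cluster (spark-ec2 uses <name>-master and <name>-slaves groups)."""
    groups = {cluster_name + "-master", cluster_name + "-slaves"}
    return [i for i in instances if groups & set(i["security_groups"])]

def confirm_destroy(instances, cluster_name, region):
    """Decide whether to prompt the user or report that nothing was found."""
    matched = find_cluster_instances(instances, cluster_name)
    if not matched:
        return "ERROR: no cluster named %s found in region %s" % (
            cluster_name, region)
    return "Found %d instance(s); prompting for confirmation" % len(matched)

if __name__ == "__main__":
    # With no matching instances, the user should see an error, not a prompt.
    print(confirm_destroy([], "my-spark-cluster", "us-east-1"))
```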
Some conf changes require a cluster restart to take effect (e.g. `SPARK_WORKER_OPTS`), but when stopping/starting the cluster via spark-ec2, it seems to re-run setup and overwrite all of the conf files. Is there a way to keep them across a stop/start via spark-ec2? BTW, I somehow cannot stop and start the cluster via Spark's own stop-all/start-all scripts.
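One workaround for the conf being flushed is to archive the conf directory before `spark-ec2 stop` and restore it after `spark-ec2 start`. The sketch below simulates that cycle locally; in a real cluster the directory would live on the EC2 master (e.g. the Spark install's `conf/` dir, a path you would need to confirm) and be copied over ssh/scp, which this example does not do.

```python
# Local simulation of the backup/restore workaround (paths are temporary,
# not real cluster paths).
import shutil
import tempfile
from pathlib import Path

work = Path(tempfile.mkdtemp())
conf = work / "conf"
conf.mkdir()
(conf / "spark-env.sh").write_text(
    'SPARK_WORKER_OPTS="-Dspark.worker.cleanup.enabled=true"\n')

# 1. Before `spark-ec2 stop`: back up the conf directory to a tarball.
backup = shutil.make_archive(
    str(work / "conf-backup"), "gztar", root_dir=work, base_dir="conf")

# 2. `spark-ec2 start` re-runs setup and flushes conf (simulated here).
shutil.rmtree(conf)
conf.mkdir()

# 3. After the cluster is back up: restore the saved conf over the fresh one.
shutil.unpack_archive(backup, work)
print((conf / "spark-env.sh").read_text().strip())
```

This keeps a point-in-time copy of every file under `conf/`, so hand-edited settings like `SPARK_WORKER_OPTS` survive the stop/start cycle; you would still need to re-apply the restore after each start, since spark-ec2 regenerates the files every time.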