Dear Author,

In the source code spark.py, the following section starts the Spark service with several scripts from /envs/spark/spark/sbin/. However, I can't find the folder /envs/spark/spark/sbin anywhere. I suppose a dedicated Spark binary should be there to communicate with the DRL-based scheduler via IPC messages. Could you provide some suggestions on how Spark is adjusted to work with park?
```python
os.system("ps aux | grep -ie spark-tpch | awk '{print $2}' | xargs kill -9")
os.system(park_path + '/envs/spark/spark/sbin/stop-master.sh')
os.system(park_path + '/envs/spark/spark/sbin/stop-slaves.sh')
os.system(park_path + '/envs/spark/spark/sbin/stop-shuffle-service.sh')
os.system(park_path + '/envs/spark/spark/sbin/start-master.sh')
os.system(park_path + '/envs/spark/spark/sbin/start-slave.sh')
os.system(park_path + '/envs/spark/spark/sbin/start-shuffle-service.sh')
```
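For what it's worth, since `os.system` silently swallows failures when those scripts are absent, a small check like the following (a hypothetical helper, not part of park; the script names are taken from the snippet above) can at least confirm which of the expected sbin scripts are actually missing under a given `park_path`:

```python
import os

# Scripts that spark.py invokes from <park_path>/envs/spark/spark/sbin/
SBIN_SCRIPTS = [
    "stop-master.sh", "stop-slaves.sh", "stop-shuffle-service.sh",
    "start-master.sh", "start-slave.sh", "start-shuffle-service.sh",
]

def missing_sbin_scripts(park_path):
    """Return the sbin scripts that do not exist under park_path."""
    sbin = os.path.join(park_path, "envs", "spark", "spark", "sbin")
    return [s for s in SBIN_SCRIPTS
            if not os.path.isfile(os.path.join(sbin, s))]
```

Running this against my checkout reports all six scripts missing, which is what prompted the question: is there a patched Spark distribution that should be dropped into that folder first?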