We detected an issue: when we run Zoo on a Databricks cluster (Standard cluster), the Zoo code always runs on a single executor, even though spark.executor.instances is set to more than 1 (for example, 6).
from pyspark.sql import SparkSession
from zoo.common.nncontext import init_spark_conf, init_nncontext

# Disable MKL in TensorFlow on the executors
sparkConf = init_spark_conf(conf={"spark.executorEnv.TF_DISABLE_MKL": "1"})
sc = init_nncontext(sparkConf)
spark = SparkSession \
    .builder \
    .appName(app_name) \
    .getOrCreate()