yahoo / CaffeOnSpark

Distributed deep learning on Hadoop and Spark clusters.
Apache License 2.0
1.27k stars · 357 forks

How can I use more cpus in cpu mode? #258

Open guyang88 opened 7 years ago

guyang88 commented 7 years ago

@junshi15 @anfeng I am running CaffeOnSpark on a CPU grid (32 cores × 10 nodes). In spark-on-YARN mode, what value should I set for "executor-cores"? When I set it higher than 1, I get an error. And in Spark standalone mode, what should "cores_per_worker" be? With the same number of worker instances, would more executor cores make the job run more efficiently?

junshi15 commented 7 years ago

In YARN mode, set spark.executor.cores = 1. See https://spark.apache.org/docs/latest/configuration.html
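Given that constraint, a way to use more CPUs is to keep one core per executor and scale the number of executors instead. Below is a hedged sketch of a spark-submit invocation for a 10-node grid; the jar name, solver file, and application options are illustrative placeholders, not verified values from this repo, while `--num-executors` and `--executor-cores` are standard spark-submit flags:

```shell
# Sketch: scale CPU usage by adding executors, not cores per executor.
# Parallelism comes from --num-executors; spark.executor.cores stays at 1.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --num-executors 10 \
  --executor-cores 1 \
  --executor-memory 8g \
  --class com.yahoo.ml.caffe.CaffeOnSpark \
  caffe-grid-with-dependencies.jar \
  -train \
  -conf lenet_solver.prototxt
```

With one executor per node, each node hosts one Caffe worker; whether a single worker can use the remaining cores on its node depends on Caffe's own threading (e.g. the BLAS library it was built against), not on Spark settings.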