menon92 opened 6 years ago
Hi,

Did you modify the example script in any way, besides changing the master address? The script is not intended to be submitted through spark-submit.

Joeri
Hello @JoeriHermans,

Thanks for your quick reply. I just added an import statement, from pyspark.sql import SparkSession, in mnist.py and changed the master address, nothing more than that.

You said the script is not intended to be submitted through spark-submit, then why does this code run when I use local[*] as the master address?

If the script is not intended to be submitted through spark-submit, then how can I run mnist.py so that it runs over two cluster nodes? Or is it not possible to run it in a cluster?

Thanks
I am trying to run mnist.py in standalone cluster mode. For this I changed
master = "local[*]" to "spark://MY_MASTER_IP:7077"
and I submit my task with the following command, but I get the following error.

With
master = "local[*]"
it works fine, but it only uses one worker. I want to use both of my workers. Thanks
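For reference, one way to point a script at a standalone master is to build the SparkSession inside the script itself rather than relying on spark-submit. This is only a minimal sketch: the master URL, app name, and executor settings below are placeholders/assumptions, and it needs a running Spark standalone cluster to actually connect.

```python
from pyspark.sql import SparkSession

# Placeholder master URL; replace with your cluster's master address,
# or use "local[*]" for a single-machine run.
master = "spark://MY_MASTER_IP:7077"

spark = (
    SparkSession.builder
    .master(master)
    .appName("mnist")                          # assumed app name
    .config("spark.executor.cores", "1")       # assumption: one core per executor
    .config("spark.cores.max", "2")            # assumption: cap cores so work spreads over two workers
    .getOrCreate()
)
```

Whether both workers actually receive tasks also depends on how the data is partitioned; with too few partitions, Spark may schedule everything on a single executor.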