Open · bharath12345 opened this issue 10 years ago
Hi @bharath12345, you're right, that's not actually covered in the docs. Have you tried scp-ing your jar into the master container (see the instructions on ssh login) and running it from there? I believe Spark should be installed under /opt.
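In case it helps, here is a minimal sketch of that workflow. The IP address, SSH key, jar name, main class, and the exact Spark directory under /opt are placeholders, and `spark-submit` assumes the image ships Spark 1.0 or newer; adjust to whatever your containers actually have:

```bash
# Copy the application jar from the host to the master container
# (host, key, and paths are placeholders)
scp -i my_cluster_key target/scala-2.10/linear-regression.jar root@<master-ip>:/root/

# Log in to the master (see the ssh login instructions)
ssh -i my_cluster_key root@<master-ip>

# Inside the master: launch the job against the cluster's master URL.
# The /opt/spark-* path and the main class are guesses; check your install.
/opt/spark-*/bin/spark-submit \
  --master spark://<master-ip>:7077 \
  --class com.example.LinearRegressionApp \
  /root/linear-regression.jar
```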
I'm afraid there is no way to deploy directly with sbt. However, you could use the data dir option when you start the cluster to attach a directory that you then deploy your jar to. You would still need to start it from the command line by ssh-ing into the master, I guess.
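A rough illustration of that alternative; the deploy script's name, its data-directory flag, and the mount point inside the containers are all assumptions here (check the script's help output for the real option names):

```bash
# Hypothetical: start the cluster with a host directory attached as a data dir
# (the -d flag is an assumption; substitute the script's actual option)
./deploy/deploy.sh -i <spark-image> -w 3 -d /home/me/spark-jobs

# Build the jar locally with sbt and drop it into that directory
sbt package
cp target/scala-2.10/linear-regression.jar /home/me/spark-jobs/

# Then ssh into the master and launch the jar from wherever the directory is
# mounted inside the container (e.g. /data, also a guess), for instance with
# spark-submit as in the sketch above.
```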
I'm a Spark and Docker noob, and this is actually a question rather than an issue.
I followed your instructions and was able to set up the cluster and run the example. This is what I see as my cluster status -
I have written a Spark program for Linear Regression which runs perfectly in local mode. It is a very small program and is on GitHub here
Now I want to run this program on my Spark cluster. The instructions in the Spark programming guide leave me scratching my head about what to do next. I would like your help to know the right way to run the application -
If this has been explained elsewhere, please point me to it, as I could not find any example of how to run an application program on a Spark cluster.
Thank you very much.