REST job server for Spark. Note that this is *not* the mainline open source version. For that, go to https://github.com/spark-jobserver/spark-jobserver. This fork now serves as a semi-private repo for Ooyala.
Hi Guys,
Thanks for providing wonderful job server capabilities over Spark.
We are new to Spark.
We are using Java and have finished experimenting with the WordCountExample. We tried the sample below and it works fine:
curl -d "input.string = a b c a b see" 'localhost:8090/jobs?appName=test&classPath=spark.jobserver.WordCountExample'
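For context, the job handling the curl call above looks roughly like this (a sketch from memory of the spark-jobserver `SparkJob` API, not the exact shipped source): the POST body `input.string = ...` arrives in `runJob` as a Typesafe `Config` object.

```scala
import com.typesafe.config.Config
import org.apache.spark.SparkContext
import spark.jobserver.{SparkJob, SparkJobValid, SparkJobValidation}

// Sketch of a word-count job: the POST body ("input.string = a b c a b see")
// is parsed by the job server into a Typesafe Config and passed to runJob.
object WordCountExample extends SparkJob {
  override def validate(sc: SparkContext, config: Config): SparkJobValidation =
    SparkJobValid // a real job would check that "input.string" is present

  override def runJob(sc: SparkContext, config: Config): Any = {
    val input = config.getString("input.string")          // "a b c a b see"
    sc.parallelize(input.split(" ").toSeq).countByValue() // word -> count
  }
}
```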
But our basic need is to pass custom objects to the runJob method: serialize an object on the client, POST it, deserialize it inside runJob, and then trigger the real work.
e.g. pass this JSON representation of a Java class Test:

{
  "menu": {
    "id": "file",
    "value": "File",
    "popup": {
      "menuitem": [
        {"value": "New", "onclick": "CreateNewDoc()"},
        {"value": "Open", "onclick": "OpenDoc()"},
        {"value": "Close", "onclick": "CloseDoc()"}
      ]
    }
  }
}
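From what we understand, the job server parses the POST body as Typesafe Config (HOCON), and valid JSON is valid HOCON, so we are guessing the request would look like our WordCount curl but with the JSON as the body, and runJob would walk the nested fields through the Config API. A sketch of what we have in mind (the class name CustomObjectJob is our own invention) — please correct us if this is wrong:

```scala
import scala.collection.JavaConverters._
import com.typesafe.config.Config
import org.apache.spark.SparkContext
import spark.jobserver.{SparkJob, SparkJobInvalid, SparkJobValid, SparkJobValidation}

// Hypothetical job: the JSON above is POSTed as the request body, e.g.
//   curl -d @test.json 'localhost:8090/jobs?appName=test&classPath=com.example.CustomObjectJob'
// and arrives in runJob as a Typesafe Config tree (JSON is valid HOCON).
object CustomObjectJob extends SparkJob {
  override def validate(sc: SparkContext, config: Config): SparkJobValidation =
    if (config.hasPath("menu")) SparkJobValid
    else SparkJobInvalid("No 'menu' object in POST body")

  override def runJob(sc: SparkContext, config: Config): Any = {
    val menu  = config.getConfig("menu")
    val id    = menu.getString("id") // "file"
    val items = menu.getConfigList("popup.menuitem").asScala
    // Rebuild a simple structure from the config and hand it to Spark
    items.map(item => (item.getString("value"), item.getString("onclick")))
  }
}
```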
We tried looking online but could not find any good sample code for this. Could you please provide sample code showing how to pass this data via POST and how to retrieve it in runJob?
Regards, Vishal