REST job server for Spark. Note that this is *not* the mainline open source version. For that, go to https://github.com/spark-jobserver/spark-jobserver. This fork now serves as a semi-private repo for Ooyala.
Is it possible to pass in custom SparkConf values (i.e., not set globally across the entire job server), either within a job that extends the `SparkJob` trait or in the `POST /contexts` request? The README.md implies that `POST /contexts` always assumes `spark.xxxx`-style settings.
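For concreteness, the kind of request in question might look like the following (a hypothetical sketch: `es-context` is an invented context name, and whether a non-`spark.`-prefixed key like `es.nodes` is accepted here is exactly what is being asked):

```
POST /contexts/es-context?es.nodes=10.1.1.1
```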
An example of a standalone Spark job, by contrast, might be set up as follows, with some custom Hadoop configs (like the Cassandra Hadoop integration, Elasticsearch, etc.):
```scala
import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf()
  .setAppName("test")
  .setMaster(sparkMaster) // sparkMaster: the Spark master URL, e.g. "spark://host:7077"
conf.set("es.nodes", "10.1.1.1")
val sc = new SparkContext(conf)
```

or

```scala
val conf = new SparkConf().set("cassandra.connection.host", "localhost")
```
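A job-server job, in contrast, receives an already-constructed SparkContext, so it is not obvious where such a setting would go. Below is a minimal sketch of the kind of job in question, assuming the `SparkJob` trait from this repo's job-server-api (with `validate` and `runJob` taking a Typesafe `Config`); the object name and the `es.nodes` lookup are hypothetical:

```scala
import com.typesafe.config.Config
import org.apache.spark.SparkContext
import spark.jobserver.{SparkJob, SparkJobValid, SparkJobValidation}

object EsNodesJob extends SparkJob {
  // The SparkContext is created by the job server, so its SparkConf is already
  // fixed by the time this job runs; a value like "es.nodes" cannot be set on
  // it here.
  override def validate(sc: SparkContext, config: Config): SparkJobValidation =
    SparkJobValid

  override def runJob(sc: SparkContext, config: Config): Any = {
    // Hypothetical workaround: read the value from the per-job config instead
    // of the SparkConf, and pass it along to the Hadoop input configuration.
    val esNodes =
      if (config.hasPath("es.nodes")) config.getString("es.nodes")
      else "localhost"
    esNodes
  }
}
```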