hashicorp / nomad-spark

DEPRECATED: Apache Spark with native support for Nomad as a scheduler

Add ability to pass driver Env vars in cluster mode #29

Open tantra35 opened 5 years ago

tantra35 commented 5 years ago

Currently this is not possible, even though Mesos, Kubernetes, and YARN can all do it through the following params (see the sketch after the list):

spark.mesos.driverEnv.<ENVVAR>      ---> Mesos
spark.kubernetes.driverEnv.<ENVVAR> ---> Kubernetes
spark.yarn.appMasterEnv.<ENVVAR>    ---> YARN
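
For comparison, a minimal sketch of how these existing prefixes are set programmatically (the app name and the MY_VAR property are illustrative, not from this issue; the proposed spark.nomad.driverEnv. prefix would be used the same way):

import org.apache.spark.SparkConf

object DriverEnvConfDemo {
  // Illustrative: forward an environment variable to the driver on Kubernetes.
  val conf: SparkConf = new SparkConf()
    .setAppName("driver-env-demo")
    .set("spark.kubernetes.driverEnv.MY_VAR", "some-value")
}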

The following small patch adds an analogous param for Nomad (spark.nomad.driverEnv.<ENVVAR>):

From 0cc50b06410ee425017f34838bfe0967ddc4358d Mon Sep 17 00:00:00 2001
From: tantra35 <ruslan.usifov@gmail.com>
Date: Thu, 19 Sep 2019 11:58:41 +0300
Subject: [PATCH] add ability to pass driver Env vars in cluster mode

---
 .../org/apache/spark/scheduler/cluster/nomad/DriverTask.scala  | 3 +++
 1 file changed, 3 insertions(+)

diff --git a/resource-managers/nomad/src/main/scala/org/apache/spark/scheduler/cluster/nomad/DriverTask.scala b/resource-managers/nomad/src/main/scala/org/apache/spark/scheduler/cluster/nomad/DriverTask.scala
index b691e75c35..2f24802bf9 100644
--- a/resource-managers/nomad/src/main/scala/org/apache/spark/scheduler/cluster/nomad/DriverTask.scala
+++ b/resource-managers/nomad/src/main/scala/org/apache/spark/scheduler/cluster/nomad/DriverTask.scala
@@ -33,6 +33,7 @@ import org.apache.spark.scheduler.cluster.nomad.SparkNomadJob.{JOB_TEMPLATE, SPA
 import org.apache.spark.util.Utils

 private[spark] object DriverTask extends SparkNomadTaskType("driver", "driver", DRIVER_MEMORY) {
+  private val NOMAD_DRIVER_ENV_KEY = "spark.nomad.driverEnv."

   private val PROPERTIES_NOT_TO_FORWARD = scala.collection.Set(
     // spark: not appropriate/relevant
@@ -69,6 +70,8 @@ private[spark] object DriverTask extends SparkNomadTaskType("driver", "driver",

     super.configure(jobConf, conf, task, ports, "spark-submit")

+    conf.getAllWithPrefix(NOMAD_DRIVER_ENV_KEY).toSeq.foreach((task.addEnv _).tupled)
+
     val additionalJarUrls = Utils.getUserJars(conf).map(asFileIn(jobConf, task))
     if (additionalJarUrls.nonEmpty) {
       conf.set("spark.jars", additionalJarUrls.mkString(","))
-- 
2.23.0.windows.1
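
For reference, here is a self-contained sketch of what the added line does (NomadTaskStub is a hypothetical stand-in for the real Nomad task builder): SparkConf.getAllWithPrefix strips the prefix and returns the remaining key/value pairs, and (task.addEnv _).tupled adapts the two-argument addEnv method into a function of a single pair so foreach can apply it:

import org.apache.spark.SparkConf

// Hypothetical stand-in for the Nomad API task object used in DriverTask.scala.
class NomadTaskStub {
  def addEnv(key: String, value: String): Unit =
    println(s"driver env: $key=$value")
}

object DriverEnvDemo {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf(loadDefaults = false)
      .set("spark.nomad.driverEnv.AWS_REGION", "us-east-1")
      .set("spark.nomad.driverEnv.LOG_LEVEL", "debug")

    val task = new NomadTaskStub

    // getAllWithPrefix("spark.nomad.driverEnv.") yields pairs like
    // ("AWS_REGION", "us-east-1"); .tupled lets addEnv consume each pair.
    conf.getAllWithPrefix("spark.nomad.driverEnv.").toSeq
      .foreach((task.addEnv _).tupled)
  }
}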
OneCricketeer commented 4 years ago

Can this patch be a proper PR?

tantra35 commented 4 years ago

@cricket007 I think the best approach is to first get some comments from the HashiCorp team in this issue, and then, if all is good, make a PR.

OneCricketeer commented 4 years ago

Review comments can be made on PRs as well.