lolaclinton closed this issue 7 years ago.
@lolaclinton I'm not sure what you mean? The Spark version your program is compiled against can be set with

```scala
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.1.1" % "provided"
)
```

But since the dependency is marked `provided`, the job will run against the Spark installation supplied by EMR when it is submitted. The Spark version on EMR is determined by EMR's release label, which can be changed with `sparkEmrRelease := "emr-5.5.0"` in `build.sbt`.
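For reference, here is a minimal `build.sbt` sketch that combines the two settings discussed above. It assumes an sbt plugin providing the `sparkEmrRelease` key is already enabled in `project/plugins.sbt`; the project name and Scala version are placeholders, not taken from your project.

```scala
// Minimal build.sbt sketch (assumed project name and Scala version).
name := "my-spark-job"
scalaVersion := "2.11.12" // Scala line compatible with Spark 2.1.x

// Spark used for compilation and local testing; "provided" keeps it
// out of the assembled jar so EMR's own Spark is used at runtime.
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.1.1" % "provided"
)

// EMR release label that determines which Spark version the cluster runs.
sparkEmrRelease := "emr-5.5.0"
```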
Is it possible to have both a local Spark version and an EMR version in the same sbt file?