colindean opened this issue 2 years ago
You can see a build begin to fail here. Note the inclusion of scopt in the `commonSettings` declared earlier in the `build.sbt`.
I'll also note that this setup has absolutely wrecked the IntelliJ project. IntelliJ appears to be importing the primary project, which is the one erroring. This is also the first time I've used projectMatrix with IntelliJ, so pointers there are welcome, too.
Your build has three `jvmPlatform(...)` rows. I don't think you can do that, since each row needs a unique virtual axis, which can represent JVM, JS, a custom axis for libraries, etc. The Scala versions are the columns, so to speak. Please try consolidating your rows into one, doing the usual pattern matching on `scalaBinaryVersion` within it, and see if that fixes your issues.
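To illustrate the row/column model, here's a minimal sketch (placeholder project name and versions; `jsPlatform` assumes sbt-scalajs is also on the build classpath):

```scala
// Each platform call is one row; the Scala versions inside it are the columns.
lazy val core = (projectMatrix in file("core"))
  .jvmPlatform(scalaVersions = Seq("2.12.17", "2.13.10")) // one JVM row, two columns
  .jsPlatform(scalaVersions = Seq("2.13.10"))             // a second row on a distinct (JS) axis
```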
Playing with this in a very small window of time just now… I wanted to do something like this, but `scalaBinaryVersion.value` isn't available where I've tried to use it:
```scala
.jvmPlatform(
  scalaVersions = Seq(scala211, scala212, scala213),
  settings = scalaBinaryVersion.value match {
    case "2.11" =>
      Seq(
        circeVersion := "0.11.2",
        circeYamlVersion := "0.10.1",
        libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.3.4" % Provided,
        (Compile / runMain) := Defaults.runMainTask(Compile / fullClasspath, Compile / run / runner).evaluated,
        generateTestData := { (Compile / runMain).toTask(" com.target.data_validator.GenTestData").value }
      )
    case "2.12" =>
      Seq(
        circeVersion := "0.14.2",
        circeYamlVersion := "0.14.1",
        libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.4.8" % Provided
      )
    case "2.13" =>
      Seq(
        circeVersion := "0.14.2",
        circeYamlVersion := "0.14.1",
        libraryDependencies += "org.apache.spark" %% "spark-sql" % "3.2.1" % Provided
      )
  }
)
```
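If I understand sbt's macro rules correctly (my assumption; I haven't verified this against the plugin), `.value` only expands inside a setting-definition macro such as `:=` or `+=`, so it can't compute the `settings = ...` argument itself. The match has to move inside each individual setting instead, e.g.:

```scala
// Legal: the match sits inside the := macro, where .value can expand.
circeVersion := (scalaBinaryVersion.value match {
  case "2.11" => "0.11.2"
  case _      => "0.14.2"
})
```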
I got `sbt` to run with this, but it has the same problem as my previous solution:
```scala
lazy val root = (projectMatrix in file("."))
  .enablePlugins(BuildInfoPlugin)
  .settings(commonSettings)
  .jvmPlatform(
    scalaVersions = Seq(scala211, scala212, scala213),
    settings = Seq(
      circeVersion := (scalaBinaryVersion.value match {
        case "2.11" => "0.11.2"
        case "2.12" | "2.13" => "0.14.2"
      }),
      circeYamlVersion := (scalaBinaryVersion.value match {
        case "2.11" => "0.10.1"
        case "2.12" | "2.13" => "0.14.1"
      }),
      libraryDependencies ++= (scalaBinaryVersion.value match {
        case "2.11" => Seq("org.apache.spark" %% "spark-sql" % "2.3.4" % Provided)
        case "2.12" => Seq("org.apache.spark" %% "spark-sql" % "2.4.8" % Provided)
        case "2.13" => Seq("org.apache.spark" %% "spark-sql" % "3.2.1" % Provided)
      }),
      //(Compile / runMain) := Defaults.runMainTask(Compile / fullClasspath, Compile / run / runner).evaluated,
      //generateTestData := { (Compile / runMain).toTask(" com.target.data_validator.GenTestData").value }
    )
  )
```
I feel like I'm getting warmer…
I aim to build a Spark app for Scala 2.11, 2.12, and 2.13, with a set of dependencies for each (Spark 2.3.4, 2.4.8, and 3.2.1 respectively) on top of the base dependencies.
My configuration seems to allow the projects for 2.11 and 2.12 to build and test correctly (e.g. `sbt root2_11/test`), while 2.13 still has some errors (working on those). However, there seems to be another project that's also building, and it can't find any dependencies, so it errors when I run `sbt test` as CI does.

I think what's happening is that the root project still thinks it should compile, when I only want the projects from the matrix's `jvmPlatform` declarations to be active. I think I need to disable this root project somehow, but I can't find a way to do that.

I'd welcome some pointers in the right direction. I'm so close to getting this cross-version build to work!
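One direction I might try (an untested sketch; `core` is a placeholder name): move the matrix out of the root directory and make the root a bare aggregator that compiles and publishes nothing.

```scala
// Sketch: the matrix lives in a subdirectory so it no longer shadows the root project.
lazy val core = (projectMatrix in file("core"))
  .enablePlugins(BuildInfoPlugin)
  .settings(commonSettings)
  .jvmPlatform(scalaVersions = Seq(scala211, scala212, scala213) /*, settings = ... */)

// The root only aggregates the generated row projects.
lazy val root = (project in file("."))
  .aggregate(core.projectRefs: _*)
  .settings(
    publish / skip := true,   // nothing to publish from the aggregator
    Compile / sources := Nil  // and nothing of its own to compile
  )
```

(The generated projects would then be named `core2_11` etc. instead of `root2_11`.)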