muuki88 / sbt-native-packager-examples

A set of sbt-native-packager examples
Apache License 2.0

Override environment-specific settings when root module isn't container of source code #14

Closed evbo closed 5 years ago

evbo commented 5 years ago

Following your documented example for overriding settings by using sub-modules, how can this be adapted for a slightly more complex project topology, like:

lazy val app = project
  .in(file("mainProject"))
  .settings(
    name := "my-app",
    libraryDependencies += "com.typesafe" % "config" % "1.3.0"
  )

lazy val stagePackage = project
  .in(file("build/stage"))
  .enablePlugins(JavaAppPackaging)
  .settings(
    resourceDirectory in Compile := (resourceDirectory in (app, Compile)).value,
    mappings in Universal += {
      ((resourceDirectory in Compile).value / "stage.conf") -> "conf/application.conf"
    }
  )
  .dependsOn(app)

I'm guessing because your example contains src/main in the root directory, any sub project is able to find the main class. But I am getting: [warn] You have no main class in your project. No start script will be generated.

I think this is due to the fact that my source code is in: mainProject/src/main/. This is because, in addition to mainProject, I have a scalajs project as well. So is there a way to create submodules not depending on root?

When I print: println((scalaSource in Compile).value.absolutePath) it is:

/me/git/test-service/build/stage/src/main/scala

but needs to be:

/me/git/test-service/mainProject/src/main/scala
muuki88 commented 5 years ago

Thanks for the detailed report :heart:

The example doesn't work! You haven't missed anything. The issue is that the Compile / mainClass setting from the app project doesn't propagate to the sub-modules, as dependsOn is only a classpath dependency.

I fixed the example in 026b7fea1d659d0077b70dae51d404b449bc61ed
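For reference, the minimal shape of the fix looks roughly like this (a sketch based on the sub-module from the question; the actual commit may differ in details):

```scala
lazy val stagePackage = project
  .in(file("build/stage"))
  .enablePlugins(JavaAppPackaging)
  .dependsOn(app)
  .settings(
    // dependsOn is only a classpath dependency, so the main class
    // discovered in `app` must be wired into this module explicitly:
    mainClass in Compile := mainClass.in(app, Compile).value,
    resourceDirectory in Compile := (resourceDirectory in (app, Compile)).value,
    mappings in Universal += {
      ((resourceDirectory in Compile).value / "stage.conf") -> "conf/application.conf"
    }
  )
```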

evbo commented 5 years ago

@muuki88 thanks for getting back to me so soon. Maybe this is just my use case, but I think this might be the most important override for ensuring the project's package is included in the subproject. Do you agree?:

mappings in Universal := mappings.in(app, Universal).value

Even after adding mainClass, it still doesn't produce a package in the subproject for me.

evbo commented 5 years ago

Also, another caveat is that for some reason setting:

mainClass in Compile := mainClass.in(app, Compile).value

will also cause the artifactPath at runtime to be set to the app's artifact. But, in order to get the overridden changes to resources, etc. we want to run the "dev" artifact. Any idea why setting mainClass to the root project causes the artifact of app to be used instead of the intended sub module's artifact? Any idea how to override which artifact is run by sbt?

evbo commented 5 years ago

@muuki88 I think I've figured out a work around that enables both packaging the submodule with universal (and getting a complete root project artifact with overrides) as well as during compile for running the submodule overrides locally:

    // as you pointed out, add the main class so we run the `app` main from submodule project
    mainClass in Compile := mainClass.in(app, Compile).value,

    // prepend the submodule jar to the classpath, this way the submodule - including all overrides - is used for running locally
    fullClasspath in reStart := {
      Attributed.blank((artifactPath in (Compile, packageBin)).value) +: (fullClasspath in reStart).value
    },

    // for classes to be included in the compiled sub module jar for running locally, add the `app` classes as resources (hack)
    resourceDirectory in Compile := target.in(app).value / s"scala-${scalaBinaryVersion.value}" / "classes",

    // for universal packaging of the submodule to be runnable, get all the mappings of the `app`
    mappings in Universal := mappings.in(app, Universal).value
muuki88 commented 5 years ago

Also, another caveat is that for some reason setting: ... will also cause the artifactPath at runtime to be set to the app's artifact

What do you mean by that?

Thanks for sharing your current solution. I wonder why you need to configure reStart as well. The example was meant to build different packages for different environments. Locally it should always be dev?

I also updated the example to make it actually run.

evbo commented 5 years ago

In my particular example, I am using scalajs, and when run locally my submodule needs to generate resources by calling fastOptJS on that other scalajs project. So, unless I'm missing something obvious, the way to handle that was to have a "dev" submodule that calls fastOptJS on the scalajs project, whereas my root project always builds using fullOptJS.

As for the artifactPath, if you print the classpath: println((fullClasspath in reStart).value.mkString("\n"))

for whatever reason it contains the app jar, not the submodule jar. Since the submodule jar has all the overrides, that is why I needed to prepend its jar to the classpath.

evbo commented 5 years ago

@muuki88 I'm not sure if this is still an appropriate place for this discussion, but for what it's worth, by using exportJars := true I'm able to simplify my submodule down to just:

lazy val stageProject = (project in file("app/build/stage"))
  .enablePlugins(JavaAppPackaging)
  .dependsOn(app)
  .settings(
    // all that's needed for running locally:
    mainClass in Compile := mainClass.in(app, Compile).value,
    exportJars := true,

    // to package a jar with the dependency archive:
    mappings in Universal := mappings.in(app, Universal).value
  )

It's well documented that exporting jars adds them to the classpath, obviating the need to tweak it manually like before.

The weird thing is that, according to sbt, I'm running my application with no classes! The jar it creates only contains resource files, no classes! It seems the classes are loaded from the app target classes directory, which I guess is fine since my submodule never introduces code changes:

Attributed(/me/Downloads/app/app/build/stage/target/scala-2.12/appStage_2.12-0.1.0-SNAPSHOT.jar)
Attributed(/me/Downloads/app/app/target/scala-2.12/classes)
... other .ivy dependencies....
muuki88 commented 5 years ago

Thanks for sharing @evbo and sorry for the late reply.

The weird thing is that according to sbt, im running my application with no classes! The jar it creates only contains resource files, no classes! It seems that the classes are loaded from the app target classes directory

That is the intended effect! The stage, prod and test modules are merely meta modules that customize some of your build configs. You are always running app, but with different configs.

Personally, I think this pattern is bad practice. Rather, your application should ship with different *.conf files, and you can then start your application with the appropriate one. However, you said that ScalaJS is involved, which probably doesn't work well with Typesafe Config.
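As a sketch of that approach (the app name and file names here are hypothetical), Typesafe Config lets you pick the configuration at launch time via its standard system properties, so a single package can serve every environment:

```shell
# select a config that is on the classpath (e.g. a bundled stage.conf)
./bin/my-app -Dconfig.resource=stage.conf

# or point at a file outside the package entirely
./bin/my-app -Dconfig.file=/etc/my-app/production.conf
```

The start scripts generated by JavaAppPackaging pass -D options through to the JVM, so no build-time switching is needed for runtime configuration.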

evbo commented 5 years ago

How would that work with different *.conf files? I use Typesafe Config for runtime configuration (e.g. for my tests or production jar to access constants, which are usually set from environment variables).

But in my case, I need to run different build processes (namely fast- or fully-optimized scalajs compilation) depending on whether I'm running the server code locally or building a production jar. I'm not sure how Typesafe Config would hook into sbt to control which build process gets run.

scalajs has the ability to configure settings per task, so you can easily configure things like: when compiling with fast optimization, use these resources, or add the following to the universal mappings, etc.

But when my server (Scala code) build needs production javascript, it isn't a scalajs project and doesn't recognize the fastOptJS task. So instead of always hard-coding it like is often done in examples online, I allow two subProjects - dev and prod - which compile either the fast-optimized or the fully-optimized JS.

Does that make sense? I appreciate you taking the time to give extended feedback like this. Always good to hear about better ways of avoiding pitfalls.
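For readers landing here later, the dev/prod split described above can be sketched roughly as follows. The project names `client` and `server` are hypothetical, and Scala.js 0.6-style task names are assumed:

```scala
// Two meta-modules that embed the Scala.js output of `client`
// into `server`'s resources at different optimisation levels.
lazy val dev = (project in file("build/dev"))
  .dependsOn(server)
  .settings(
    resourceGenerators in Compile += Def.task {
      // fastOptJS: quick, lightly optimised output for local runs
      val js  = (fastOptJS in (client, Compile)).value.data
      val out = (resourceManaged in Compile).value / "assets" / js.getName
      IO.copyFile(js, out)
      Seq(out)
    }.taskValue
  )

lazy val prod = (project in file("build/prod"))
  .dependsOn(server)
  .settings(
    resourceGenerators in Compile += Def.task {
      // fullOptJS: fully optimised output for the production package
      val js  = (fullOptJS in (client, Compile)).value.data
      val out = (resourceManaged in Compile).value / "assets" / js.getName
      IO.copyFile(js, out)
      Seq(out)
    }.taskValue
  )
```

This keeps the `server` project itself free of any Scala.js tasks, while each meta-module decides which optimisation level feeds its resources.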