scala-js / scala-js

Scala.js, the Scala to JavaScript compiler
https://www.scala-js.org/
Apache License 2.0

Referring to non-existent class #629

Closed: mseddon closed this issue 10 years ago

mseddon commented 10 years ago

I have a project that mixes Scala.js and Scala/JVM code (the JVM side is based on spray-can), using Scala.js 0.4.3. For fun, I tried to add "org.scalaz" %% "scalaz-core" % "7.0.6" to the Scala.js project's libraryDependencies. While the compilation step passes fine, preoptimizeJS fires a bunch of warnings along the lines of:

[warn] Referring to non-existent class scalaz_Apply$
[warn] Referring to non-existent class scalaz_Scalaz$
[warn] Referring to non-existent class scalaz_Apply
[warn] Referring to non-existent method scalaz_Scalaz$.someOLscala_Option
[warn] called from com_example_Test.main__V

This is most likely not a Scala.js issue but rather my bad project configuration; still, I am wondering where to begin tracking it down. How can it be that Scala.js compiles the classes correctly and then loses them when sending them to the Closure compiler? There are certainly no obvious errors being reported.

Incidentally, I ran into a similar problem the other day upgrading a project to use scala-js 0.5-M2 and scala-js-dom 0.5-M2. In this case, no classes from scala-js-dom could be found, but simply restoring the versions for scalajs-sbt-plugin and scala-js-dom back to 0.4.3 solved the issue. At the time I assumed I was perhaps too far on the bleeding edge, but now I'm almost entirely certain I'm not understanding something.

sjrd commented 10 years ago

This behavior is perfectly normal. These warnings are emitted by the Scala.js sbt plugin when it preoptimizes (by the way, preoptimizeJS is our own optimizer, not Closure; optimizeJS runs Closure after our own optimizer) and detects that your code refers to classes for which no JavaScript code could be found. The compilation step (compile) does not check for the existence of JavaScript files: it compiles against .class files, which is why it was able to compile your classes correctly.

The problem is that you can't just depend on the binaries of Scalaz compiled for the JVM and expect Scala.js to be able to use them. You need to depend on a version of Scalaz compiled for Scala.js (which, AFAIK, does not exist yet). Your configuration, as it is, allows the compiler to produce JavaScript code for your application, using the types and classes it finds in scalaz-core.jar, but it does not produce JavaScript code for Scalaz itself. So preoptimizeJS tells you exactly that (because it actually looks for the JavaScript files).

If you had the same problem with dom and 0.5.0-M2, that is probably because either 1) you did not use scalajs-dom 0.5-SNAPSHOT (which is the first version compiled for 0.5.0-M2) or 2) you forgot to use %%% instead of %% (and it unfortunately found the remnants of a version compiled for 0.4.x, which is not compatible with 0.5.0-M2).
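
To make the difference concrete, the two flavours of dependency look roughly like this in a build.sbt (the scalajs-dom coordinates below are only indicative; double-check the group ID and version that match your Scala.js version):

// JVM dependency: resolves scalaz-core compiled for the JVM, so preoptimizeJS
// will not find any JavaScript code for it.
libraryDependencies += "org.scalaz" %% "scalaz-core" % "7.0.6"

// Scala.js dependency: %%% (provided by the Scala.js sbt plugin) also encodes the
// Scala.js version, so it resolves an artifact actually compiled for Scala.js.
// Group ID and version here are indicative only.
libraryDependencies += "org.scala-lang.modules.scalajs" %%% "scalajs-dom" % "0.5-SNAPSHOT"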

mseddon commented 10 years ago

Aha, yes that's completely obvious now, thanks!

I think I will defer trying to port scalaz, but it seems now would be a good time to experiment with setting up an sbt project that lets me cross-compile and publish my own pure Scala libraries to my private repo, so I can depend on them in either JVM or Scala.js projects. I think I understand what I need to do, but I'll outline it here in case I've missed anything critical (I'm still grappling with sbt, so apologies if this seems trivial):

For simplicity, let us assume for now I just want to publish a library containing a single scala object. There is no scalajs or jvm dependent code. I would create a project "myTestLib", which would contain the source for this object.

I would then have to define two projects, myTestLibJvm and myTestLibJs, which both depend on myTestLib, although they are empty in this case. These will cause the appropriate class/js files to be produced.

I would aggregate all these projects into a single root project for packaging, and also have to addSbtPlugin for scala-js in my top-level project/plugins.sbt.
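
Concretely, I imagine something like this (plugin coordinates copied from memory, so they may well be wrong, and I may also have the dependency mechanism wrong):

// project/plugins.sbt (group ID from memory, may be wrong)
addSbtPlugin("org.scala-lang.modules.scalajs" % "scalajs-sbt-plugin" % "0.4.3")

// build.sbt
lazy val myTestLib = project                 // shared, pure-Scala sources

lazy val myTestLibJvm = project.dependsOn(myTestLib)   // plain JVM build

lazy val myTestLibJs = project.dependsOn(myTestLib).settings(scalaJSSettings: _*)   // Scala.js build

lazy val root = project.in(file(".")).aggregate(myTestLibJvm, myTestLibJs)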

Should the above work? Assuming it does, it seems like a sensible setup.

I also think this may be too complex in the case where there is literally no platform-dependent code. Is there a simpler way to just get sbt to compile and publish both a Scala.js and a JVM version of the library, in a build file containing just one project?

It is not entirely clear to me what causes scalajs to compile, say, the js component of scala-js-pickling, and not the jvm part. Is it simply the presence of scalaJSSettings in the project? And if so, does this mean the project is now only a scala-js target, or have I produced both scala-js output and jvm output?

gzm0 commented 10 years ago

Your setup would almost work: if you depend on a project, you depend on its binaries, not on its sources. However, to cross-compile for the JVM and Scala.js, you want to depend on the source files. AFAIK, you have to do this manually (unmanagedSources in JVMProject ++= unmanagedSources.in(CoreProject).value).

There is currently no simpler way to do that (and there probably won't be).

Further: yes, it is simply the scalaJSSettings that cause Scala.js to compile. They do everything, from depending on and adding the compiler plugin, to changing the run and test tasks to use a JavaScript virtual machine, etc.

If you add the scalaJSSettings to your project, the compilation will still produce .class files, but you must not use these since they contain strange stuff (mainly due to exports). The compiler uses them for symbol lookup when doing separate compilation. So, no, once you add the scalaJSSettings, your project is a Scala.js target only.

As for Scalaz, it's probably sufficient to add the Scala.js plugin and settings to make it work. It worked for shapeless (see #599).

mseddon commented 10 years ago

Fantastic, thanks so much for clearing things up!

I'm a little confused about adding the unmanagedSources.in(CoreProject).value to the JVMProject, though; I would assume I would want to add it to (unmanagedSources in JSProject), because it is the Scala.js project that needs it? Can you show me some context for where this would go in a simple sbt build?

Thanks so much for your help, everyone, it's really appreciated.

gzm0 commented 10 years ago

I'll just make an example (SBT 0.13.x):

lazy val root = project.aggregates(jvm, js)

// only contains code, doesn't do anything else. Add shared code here
lazy val core = project 

// JVM project. Add JVM specific sources in its subtree
lazy val jvm = project.settings(
  unmanagedSourceDirectories <++= unmanagedSourceDirectories in core)

// Scala.js project. Add Scala.js specific sources in its subtree
lazy val js = project.settings(scalaJSSettings: _*).settings(
    unmanagedSourceDirectories <++= unmanagedSourceDirectories in core)

That should be more or less a minimal example.

mseddon commented 10 years ago

That's perfect. I realise now that the unmanagedSourceDirectories setting is just how we're sharing the core code with the other projects (sbt newbie woes). I think I have everything I need now. I'm extremely grateful to you both for taking the time to explain this.

mseddon commented 10 years ago

Whoops, sorry to bother you again; just one last question w.r.t. the example you gave. aggregates was obviously a typo for aggregate, but trying it again I am getting some scope issues. I have fixed one half of the problem by writing unmanagedSourceDirectories in Compile <++= ..., but it's still barfing on the unmanagedSourceDirectories in core right-hand side:

[error]   core/*:unmanagedSourceDirectories from js/compile:unmanagedSourceDirectories(C:\scala-eclipse\workspace\test-lib\build.sbt:12)
[error]      Did you mean core/compile:unmanagedSourceDirectories ?
[error]
[error]   core/*:unmanagedSourceDirectories from jvm/compile:unmanagedSourceDirectories(C:\scala-eclipse\workspace\test-lib\build.sbt:8)
[error]      Did you mean core/compile:unmanagedSourceDirectories ?

My adapted sbt file:

lazy val root = project.aggregate(jvm, js)

// only contains code, doesn't do anything else. Add shared code here
lazy val core = project 

// JVM project. Add JVM specific sources in its subtree
lazy val jvm = project.settings(
  unmanagedSourceDirectories in Compile <++= unmanagedSourceDirectories in core)

// Scala.js project. Add Scala.js specific sources in its subtree
lazy val js = project.settings(scalaJSSettings: _*).settings(
    unmanagedSourceDirectories in Compile <++= unmanagedSourceDirectories in core)

I've "fixed" this further, by doing:

lazy val root = project.aggregate(jvm, js)

// only contains code, doesn't do anything else. Add shared code here
lazy val core = project 

// JVM project. Add JVM specific sources in its subtree
lazy val jvm = project.settings(
  unmanagedSourceDirectories in Compile <++= (unmanagedSourceDirectories in core) in Compile)

// Scala.js project. Add Scala.js specific sources in its subtree
lazy val js = project.settings(scalaJSSettings: _*).settings(
    unmanagedSourceDirectories in Compile <++= (unmanagedSourceDirectories in core) in Compile)

Am I doing the right thing, and would you expect that double nesting of in's?

sjrd commented 10 years ago

Yes, you did exactly the right thing. There's nothing wrong with the double in's, although you can write it in a nicer way like this:

lazy val jvm = project.settings(
  unmanagedSourceDirectories in Compile <++= unmanagedSourceDirectories in (core, Compile))

(note in (core, Compile))
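
Putting it all together, your full build.sbt should then look roughly like this:

lazy val root = project.aggregate(jvm, js)

// only contains code, doesn't do anything else. Add shared code here
lazy val core = project

// JVM project. Add JVM specific sources in its subtree
lazy val jvm = project.settings(
  unmanagedSourceDirectories in Compile <++= unmanagedSourceDirectories in (core, Compile))

// Scala.js project. Add Scala.js specific sources in its subtree
lazy val js = project.settings(scalaJSSettings: _*).settings(
  unmanagedSourceDirectories in Compile <++= unmanagedSourceDirectories in (core, Compile))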

mseddon commented 10 years ago

Great, thanks again! I was sure there was at least some more sensible syntax for this case. Closing this, finally.