marcgrue opened this issue 2 years ago
Workaround using platform-specific imports:
// js
import org.scalajs.macrotaskexecutor.MacrotaskExecutor
import scala.concurrent.ExecutionContext

trait MacroExecutorImpl {
  def globalImpl: ExecutionContext = MacrotaskExecutor.Implicits.global
}

// jvm
import scala.concurrent.ExecutionContext

trait MacroExecutorImpl {
  def globalImpl: ExecutionContext = ExecutionContext.global
}

// shared
object MacroExecutor extends MacroExecutorImpl {
  implicit def global: ExecutionContext = globalImpl
}

// shared use
import path.to.your.MacroExecutor._
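With those files in place, shared code only needs the single import above; a minimal usage sketch (the Future computation is purely illustrative):

// shared usage (illustrative)
import scala.concurrent.Future
import path.to.your.MacroExecutor._

def compute(): Future[Int] = Future(40 + 2) // runs on MacrotaskExecutor under JS, ExecutionContext.global on the JVM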
Thanks for reporting! Having cross-compiled lots of libraries for JVM/JS, I completely understand the nuisance this creates. However, shimming the MacrotaskExecutor for JVM ... would be really weird and even confusing. The concepts of "macrotasks" and "microtasks" are unique to JS.
A slight rephrasing of this issue could be:

Use the MacrotaskExecutor as the default implementation for scala.concurrent.ExecutionContext.global in Scala.js.

scala.concurrent.ExecutionContext.global was a JVM/JS cross-platform way to get an ExecutionContext. Except that import is now deprecated as of Scala.js 1.8.0. The explanation for why the MacrotaskExecutor cannot be used as the implementation for scala.concurrent.ExecutionContext.global is given here:
http://www.scala-js.org/news/2021/12/10/announcing-scalajs-1.8.0/
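Concretely, since 1.8.0 the JS side has to opt into an executor explicitly; a sketch of what that can look like with this library (the val name is illustrative; the MacrotaskExecutor object itself is an ExecutionContextExecutor):

// js, Scala.js >= 1.8.0: choose an ExecutionContext explicitly
import scala.concurrent.ExecutionContext
import org.scalajs.macrotaskexecutor.MacrotaskExecutor

implicit val ec: ExecutionContext = MacrotaskExecutor // instead of the deprecated ExecutionContext.global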
Thanks for the explanation! When coding shared code though, one needs to have both implementations anyway somehow, and it would be nice without workarounds. Maybe the subject of micro/macrotasks is not that important from the user perspective and more of an implementation consideration? Would the weird/confusing thing about a shared API be that setImmediate and setTimeout are not relevant on the JVM?
When coding shared code though, one needs to have both implementations anyway somehow and it would be nice without workarounds.
💯 agree. I think in an ideal world this would live in scala.concurrent.ExecutionContext.global, since that is a standard API that exists across all Scala platforms. In a less-than-ideal world, someone can create a cross-platform library, e.g. "scala-execution-contexts", that provides good defaults.
However, it is almost definitely out-of-scope for this project to do that. For example, what if someone opens an issue asking us to also shim this for Scala Native? Should we do that too, why/why not?
Scala Native is a good point. Perhaps this is a job for https://github.com/portable-scala, actually.
Exactly my thinking! :)
How many shims are we potentially talking about?
My take on this is that importing a specific ExecutionContext in library code is usually not ideal, because it removes the flexibility of providing your own. In my case, I write the shared code to depend on the ExecutionContext, just like the Future API does; this way, you can hook up the correct ExecutionContext in the application's entry point (macrotask-executor in js, something else on jvm/native).
Example:

// shared code
import scala.concurrent.{ExecutionContext, Future}

class MyService(implicit ec: ExecutionContext) {
  def greet(name: String): Future[String] = Future(s"Hello, $name!") // illustrative body
}

// js
import org.scalajs.macrotaskexecutor.MacrotaskExecutor.Implicits._

class JsMain {
  val service = new MyService
}

// jvm
import scala.concurrent.ExecutionContext.Implicits.global

class JvmMain {
  val service = new MyService
}
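A side benefit of injecting the ExecutionContext this way: tests can supply a deterministic executor that runs everything on the calling thread (hypothetical sketch):

// test (hypothetical): a synchronous EC makes Future-based code deterministic
import scala.concurrent.ExecutionContext

implicit val syncEc: ExecutionContext = new ExecutionContext {
  def execute(runnable: Runnable): Unit = runnable.run()
  def reportFailure(cause: Throwable): Unit = throw cause
}

val service = new MyService // picks up syncEc implicitly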
Cross-compiling the macrotask-executor to both JS/JVM would allow using it in shared code without platform-specific imports, possibly by providing a standard/supplied ExecutionContext shim on the JVM side.
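For illustration, a minimal sketch of what such a JVM shim might look like, assuming it mirrors the JS artifact's package/object names and simply delegates to the standard global pool (hypothetical design, not actual library code):

// jvm shim (hypothetical): same package/object as the JS artifact,
// delegating to the standard JVM global ExecutionContext
package org.scalajs.macrotaskexecutor

import scala.concurrent.{ExecutionContext, ExecutionContextExecutor}

object MacrotaskExecutor extends ExecutionContextExecutor {
  def execute(runnable: Runnable): Unit = ExecutionContext.global.execute(runnable)
  def reportFailure(cause: Throwable): Unit = ExecutionContext.global.reportFailure(cause)

  object Implicits {
    implicit val global: ExecutionContext = MacrotaskExecutor
  }
}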