Closed: vincenzobaz closed this 1 year ago
Looks good but the CI needs some care:
```
[error] Not a valid key: examples (similar: compilers, name, compile)
[error] examples / runMain rdd.wordcount
[error]         ^
Error: Process completed with exit code 1.
```
Do you know if there is a way to set a default project for runMain? I would like to avoid having to change the GitHub Actions YAML for each new version.
> Do you know if there is a way to set a default project for runMain? I would like to avoid having to change the GitHub Actions YAML for each new version.
The default project is always the root project and you cannot really change that. You can run `project example_spark350` in an sbt shell to set it as the current project.
Alternatively, you can create a command alias, maybe one that reads the project from an env var and runs the main on it.
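A minimal sketch of such an alias, assuming a hypothetical env var `SPARK_EXAMPLE_PROJECT` and alias name `runExample` (neither exists in the build today; `example_spark350` is just the fallback default):

```scala
// build.sbt (sketch) -- pick the example project from an environment
// variable so the GitHub Actions YAML never has to name it directly.
val exampleProject =
  sys.env.getOrElse("SPARK_EXAMPLE_PROJECT", "example_spark350")

// Extra arguments are appended to the aliased command, so
// `sbt "runExample rdd.wordcount"` expands to
// `example_spark350 / runMain rdd.wordcount`.
addCommandAlias("runExample", s"$exampleProject / runMain")
```

CI would then always invoke `sbt "runExample rdd.wordcount"` and only set `SPARK_EXAMPLE_PROJECT` per version.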
> You can run `project example_spark350`

Even this seems hidden now:
```
sbt:spark-scala3> projects
[info] In file:/home/vinz/Desktop/spark-scala3/
[info]     proj_spark3323
[info]     proj_spark3503
[info]   * spark-scala3
```
It looks like sbt now only sees root :/

This happens if I call `projectMatrix` inside my function; otherwise I still have access to them.
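For context, roughly the shape of the helper in question, as far as it can be reconstructed from this thread (the body, dependency, and versions are guesses):

```scala
// Hypothetical helper: building the matrix inside a def. Because
// projectMatrix derives the project id through a macro, rows created
// here are not discovered as top-level projects, so `projects` in the
// sbt shell shows only root.
def sparkVersionMatrix(sparkVersion: String) =
  projectMatrix
    .in(file("examples"))
    .settings(
      libraryDependencies +=
        ("org.apache.spark" %% "spark-core" % sparkVersion)
          .cross(CrossVersion.for3Use2_13)
    )
    .jvmPlatform(scalaVersions = Seq("3.3.1"))
```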
> It looks like sbt now only sees root :/
It seems sbt assumes that all your projects are top-level statements. `projectMatrix` is a macro def, so it somehow inlines things to the top level. Either you should define `sparkVersionMatrix` as a macro def (which does not seem to be a good idea) or you should get rid of it and inline everything manually.
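Inlining manually would look roughly like this: one top-level val per Spark version, so sbt's macro-based project discovery can see each matrix. A sketch with assumed directories and versions (the real build may share a single examples source tree instead):

```scala
// build.sbt (sketch) -- duplicated top-level matrices instead of a
// helper function; the duplication is the price of discovery.
lazy val example_spark332 = projectMatrix
  .in(file("examples-332"))
  .settings(
    libraryDependencies +=
      ("org.apache.spark" %% "spark-core" % "3.3.2")
        .cross(CrossVersion.for3Use2_13)
  )
  .jvmPlatform(scalaVersions = Seq("3.3.1"))

lazy val example_spark350 = projectMatrix
  .in(file("examples-350"))
  .settings(
    libraryDependencies +=
      ("org.apache.spark" %% "spark-core" % "3.5.0")
        .cross(CrossVersion.for3Use2_13)
  )
  .jvmPlatform(scalaVersions = Seq("3.3.1"))
```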
Addresses the points raised in #42 concerning support for multiple Spark versions.
Only the latest Spark version will be published, but testing and running the examples will happen with all versions.
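One common way to get that split in sbt (a sketch; the PR may implement it differently) is to keep the older matrices in the build for compiling and testing but mark them unpublishable:

```scala
// Hypothetical: the 3.3.x examples still compile and test in CI,
// but `publish / skip` keeps their artifacts from being released.
lazy val example_spark332 = projectMatrix
  .in(file("examples-332"))
  .settings(publish / skip := true)
  .jvmPlatform(scalaVersions = Seq("3.3.1"))
```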