vincenzobaz / spark-scala3


Spark versions matrix #43

Closed · vincenzobaz closed 1 year ago

vincenzobaz commented 1 year ago

Addresses the points raised in #42 concerning support for multiple Spark versions.

Only the latest Spark version will be published, but testing and running the examples will happen with all versions.
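
For reference, a minimal sketch of what such a matrix can look like with sbt-projectmatrix; the versions, axis, and setting names here are illustrative, not necessarily what this PR actually uses:

```scala
// build.sbt (sketch) — one matrix row per Spark version; every row is
// compiled and tested, but only the row for the latest Spark is published
case class SparkAxis(sparkVersion: String) extends VirtualAxis.WeakAxis {
  override val directorySuffix: String = s"-spark$sparkVersion"
  override val idSuffix: String = s"_spark${sparkVersion.replace(".", "")}"
}

val sparkVersions = Seq("3.3.2", "3.5.0")
val latestSpark = sparkVersions.last

def sparkSettings(sparkVersion: String) = Seq(
  // Spark is only published for Scala 2.13, hence the cross setting
  libraryDependencies += ("org.apache.spark" %% "spark-sql" % sparkVersion)
    .cross(CrossVersion.for3Use2_13),
  publish / skip := sparkVersion != latestSpark
)

lazy val examples = (projectMatrix in file("examples"))
  .customRow(
    scalaVersions = Seq("3.3.1"),
    axisValues = Seq(SparkAxis("3.3.2"), VirtualAxis.jvm),
    _.settings(sparkSettings("3.3.2"))
  )
  .customRow(
    scalaVersions = Seq("3.3.1"),
    axisValues = Seq(SparkAxis("3.5.0"), VirtualAxis.jvm),
    _.settings(sparkSettings("3.5.0"))
  )
```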

vincenzobaz commented 1 year ago

Looks good, but the CI needs some care:

```
[error] Not a valid key: examples (similar: compilers, name, compile)
[error] examples / runMain rdd.wordcount
[error]         ^
```

Error: Process completed with exit code 1.

Do you know if there is a way to set a default project for `runMain`? I would like to avoid having to change the GitHub Actions YAML for each new version.

adpi2 commented 1 year ago

> Do you know if there is a way to set a default project for `runMain`? I would like to avoid having to change the GitHub Actions YAML for each new version.

The default project is always the root project and you cannot really change that. You can run `project example_spark350` in an sbt shell to set it as the current project.

Alternatively, you can create a command alias, for example one that reads the project from an environment variable and runs the main class on it.
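
Something along these lines, for instance; the command name and environment variable are hypothetical, and `rdd.wordcount` is taken from the CI log above:

```scala
// build.sbt (sketch): a custom command that reads the example project
// from an environment variable, so the workflow YAML never hardcodes it
// (SPARK_EXAMPLES_PROJECT and runExamples are hypothetical names)
commands += Command.command("runExamples") { state =>
  val project = sys.env.getOrElse("SPARK_EXAMPLES_PROJECT", "example_spark350")
  s"$project / runMain rdd.wordcount" :: state
}
```

CI would then invoke `sbt runExamples` with `SPARK_EXAMPLES_PROJECT` set per matrix entry.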

vincenzobaz commented 1 year ago

> You can run `project example_spark350`

Even this seems hidden now:

```
sbt:spark-scala3> projects
[info] In file:/home/vinz/Desktop/spark-scala3/
[info]     proj_spark3323
[info]     proj_spark3503
[info]   * spark-scala3
```

It looks like sbt now only sees the root project :/

vincenzobaz commented 1 year ago

This happens if I call `projectMatrix` inside my function; otherwise I still have access to them.

adpi2 commented 1 year ago

> It looks like sbt now only sees the root project :/

It seems sbt assumes that all your projects are top-level statements. `projectMatrix` is a macro def, so it somehow inlines things at the top level.

Either you should define `sparkVersionMatrix` as a macro def (which does not seem to be a good idea), or you should get rid of it and inline everything manually.
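
Roughly the difference between the two shapes, as a sketch; the identifiers are illustrative and the real `sparkVersionMatrix` may look different:

```scala
// Hidden from `projects`: the projectMatrix macro runs inside a def,
// so the generated rows are not top-level statements of build.sbt
def sparkVersionMatrix(sparkVersion: String): ProjectMatrix =
  (projectMatrix in file("examples")).customRow(
    scalaVersions = Seq("3.3.1"),
    axisValues = Seq(VirtualAxis.jvm),
    _.settings(name := s"examples-spark$sparkVersion")
  )

// Visible: the same call inlined manually as a top-level statement
lazy val examples = (projectMatrix in file("examples")).customRow(
  scalaVersions = Seq("3.3.1"),
  axisValues = Seq(VirtualAxis.jvm),
  _.settings(name := "examples-spark350")
)
```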