What changes were proposed in this pull request?

Resolves DCOS-54709 - Add Scala 2.12 support for Spark
Updated the Makefile and the bin/jenkins-*.sh scripts to support multiple Scala versions
Updated manifest.json with new Spark distributions built with Scala 2.12
Set the default Spark variant to use Hadoop 2.9 and Scala 2.12
Added a smoke test for custom Spark Docker images
How were these changes tested?
I wanted to avoid creating test tags in git, so I tested the changes manually without using any production credentials, except for uploading the new Spark distributions to the downloads.mesosphere.io domain.
1) Ran the distribution build script in the spark-tools container, as described in the wiki, with the mesosphere/spark and mesosphere/spark-build repositories checked out and mounted next to each other. I added a DIST_DESTINATION_DIR variable to skip the upload to S3 and save the archives locally.
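The local-vs-S3 behavior that DIST_DESTINATION_DIR enables can be sketched as follows; the real build script lives in spark-build and is not reproduced here, so publish_archive is a hypothetical helper illustrating the idea:

```shell
#!/usr/bin/env sh
# Hypothetical helper: if DIST_DESTINATION_DIR is set, keep the archive
# locally instead of uploading it to S3.
publish_archive() {
  archive="$1"
  if [ -n "${DIST_DESTINATION_DIR:-}" ]; then
    mkdir -p "$DIST_DESTINATION_DIR"
    cp "$archive" "$DIST_DESTINATION_DIR/"   # save locally, skip the upload
    echo "saved to $DIST_DESTINATION_DIR"
  else
    echo "would upload $(basename "$archive") to S3"
  fi
}
```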
2) Uploaded the Spark distributions to downloads.mesosphere.io and updated manifest.json.
3) Ran the Docker image build script to build and push the images to my personal Docker Hub account. As noted above, this was only to verify that the script works; for an actual release, simply create a release tag and run the Jenkins job as described in the wiki.
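The tag matrix the script produces can be sketched like this; spark_image_tags is a hypothetical helper, but the naming scheme matches the images listed in the smoke-test variables (rpalaznik is my personal Docker Hub account):

```shell
#!/usr/bin/env sh
# Hypothetical helper: enumerate one image tag per Scala/Hadoop combination.
spark_image_tags() {
  version="$1"
  for scala in 2.11 2.12; do
    for hadoop in 2.7 2.9; do
      echo "rpalaznik/spark:${version}-scala-${scala}-hadoop-${hadoop}"
    done
  done
}
# Each tag would then be built and pushed, roughly:
#   docker build -t "$tag" . && docker push "$tag"
```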
4) Ran the smoke test for custom Docker images, following the wiki's instructions for running integration tests and passing these variables:
TEST_SH_SMOKE_TEST_SPARK_DOCKER_IMAGES='rpalaznik/spark:2.4.3-scala-2.11-hadoop-2.7,rpalaznik/spark:2.4.3-scala-2.11-hadoop-2.9,rpalaznik/spark:2.4.3-scala-2.12-hadoop-2.7,rpalaznik/spark:2.4.3-scala-2.12-hadoop-2.9'
PYTEST_ARGS='-k test_spark_docker_images'
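Wiring these variables into a run looks roughly like the sketch below; the actual integration-test entrypoint is described in the wiki and is not reproduced here, so the commented pytest line only stands in for whatever the runner ultimately invokes:

```shell
#!/usr/bin/env sh
# The smoke test takes the images as one comma-separated list.
export TEST_SH_SMOKE_TEST_SPARK_DOCKER_IMAGES='rpalaznik/spark:2.4.3-scala-2.11-hadoop-2.7,rpalaznik/spark:2.4.3-scala-2.11-hadoop-2.9,rpalaznik/spark:2.4.3-scala-2.12-hadoop-2.7,rpalaznik/spark:2.4.3-scala-2.12-hadoop-2.9'
export PYTEST_ARGS='-k test_spark_docker_images'
# pytest $PYTEST_ARGS   # run via the wiki's integration-test instructions
```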
If TEST_SH_SMOKE_TEST_SPARK_DOCKER_IMAGES is not set, e.g. during a CI integration test run, the smoke test will be skipped.

Release Notes