ash211 opened 7 years ago
Kicked off a build at http://spark-k8s-jenkins.pepperdata.org:8080/job/PR-spark-k8s-unit-tests-SBT-TESTING running Spark's ./dev/run-tests
to see what happens -- I know @kimoonkim attempted this earlier, and I forget where we ended up.
The question again is: what tests should we run? I think we decided in the original Maven build to only run the things related to Kubernetes, in order to cut down the build time by avoiding building things we don't touch.
Maybe the SBT tests would catch checkstyle issues like unused identifiers: https://github.com/apache-spark-on-k8s/spark/pull/281#discussion_r117361047
We need to enable running SBT tests on the kubernetes module via https://github.com/palantir/spark/pull/191/commits/de3e7727ff3ddb6c07052e3cc187e4bf6b10bc66
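As a rough sketch (not taken from the linked commit), enabling the module in SBT and running just its tests might look something like the following; the `kubernetes` project name and profile are assumptions based on the module layout at resource-managers/kubernetes/core, and may differ from what the commit actually defines:

```shell
# Hypothetical invocation -- assumes SparkBuild.scala defines a `kubernetes`
# SBT project for resource-managers/kubernetes/core and that the -Pkubernetes
# profile is wired up; names unverified against the linked commit.
./build/sbt -Pkubernetes -Phadoop-2.7 "kubernetes/test"
```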
Also, running ./dev/run-tests runs ./dev/check-license, which would've caught https://github.com/apache-spark-on-k8s/spark/pull/296
Currently in this repo we're running tests via Maven:
http://spark-k8s-jenkins.pepperdata.org:8080/job/PR-spark-k8s-unit-tests/
./build/mvn clean test -Pmesos -Pyarn -Phadoop-2.7 -Pkubernetes -pl core,resource-managers/kubernetes/core -am -Dtest=none -Dsuffixes='^org\.apache\.spark\.(?!SortShuffleSuite$|rdd\.LocalCheckpointSuite$|deploy\.SparkSubmitSuite$|deploy\.StandaloneDynamicAllocationSuite$).*'
whereas in Apache Spark the tests are run via SBT:
https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/ https://github.com/apache/spark/blob/bbd163d589e7503c5cb150d934e7565b18a908f2/dev/run-tests.py#L527
[info] Running Spark tests using SBT with these arguments: -Phadoop-2.6 -Phive -Pyarn -Pmesos -Phive-thriftserver -Pkinesis-asl -Dtest.exclude.tags=org.apache.spark.tags.ExtendedYarnTest test
There are subtle differences between SBT and Maven in how tests are run (largely around dependency resolution), so for maximal compatibility with Apache we should be running with SBT.
This has been causing problems in the Palantir Spark repo, into which we cherry-pick these commits.