Closed: samikrc closed this issue 4 years ago
Will update the develop pom version to 2012.2-SNAPSHOT for this.
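For reference, a minimal sketch of where that bump would go in the develop pom.xml; the surrounding coordinates are placeholders, not the project's actual ones:

```xml
<!-- Sketch only: groupId/artifactId are placeholders, not the project's real coordinates. -->
<project xmlns="http://maven.apache.org/POM/4.0.0">
    <modelVersion>4.0.0</modelVersion>
    <groupId>com.example</groupId>
    <artifactId>project-parent</artifactId>
    <!-- develop version bumped for the Scala/Spark upgrade work -->
    <version>2012.2-SNAPSHOT</version>
</project>
```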
I have created a new branch with all of the above: feature/scala-spark-upgrade. I then ran all the tests (there is now a script to run all the tests without Docker; see the README in this branch), and the same tests are failing:
Failed Tests:
systemTests.BinaryNBTest
systemTests.MultiIntentDTCVTest
systemTests.MultiIntentLRRandomSamplingTest
systemTests.MultiIntentLRStratifiedSamplingTest
systemTests.MultiIntentMLPCVTest
systemTests.MultiIntentNBCVTest
systemTests.MultiIntentRFCVTest
systemTests.MultiIntentSVMCVTest
systemTests.MultiIntentSVMHyperBandTest
systemTests.MultiIntentSVMStdMetricsTest
functionalTests.PreprocessingTest
@Udhay247: If you are working on this, please check that everything else looks alright apart from these failures. Will log a separate bug for them.
Completed and merged to develop. Closing.
Scala 2.11.x has pretty much been deprecated everywhere. Although Spark is a bit late to the Scala 2.12.x party, now that the support is there, we should move to Scala 2.12.x and the new Spark version (2.4.6).
The scalatest-related modules also need to be updated to their Scala 2.12.x builds.
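For illustration, a hedged sketch of the kind of pom.xml changes this upgrade implies. The property names (scala.version, scala.binary.version, spark.version) and the scalatest version shown here are assumptions, not taken from the actual build file; the key point is that the cross-version suffix of every Scala artifact, including scalatest, has to move from _2.11 to _2.12:

```xml
<!-- Illustrative only: property names and the scalatest version are assumed, not from the real pom. -->
<properties>
    <scala.version>2.12.10</scala.version>
    <scala.binary.version>2.12</scala.binary.version>
    <spark.version>2.4.6</spark.version>
</properties>

<dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <!-- artifact suffix switches from _2.11 to _2.12 -->
        <artifactId>spark-core_${scala.binary.version}</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>org.scalatest</groupId>
        <!-- scalatest must also be pulled in as its Scala 2.12 build -->
        <artifactId>scalatest_${scala.binary.version}</artifactId>
        <version>3.0.8</version> <!-- assumed version -->
        <scope>test</scope>
    </dependency>
</dependencies>
```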