apache-spark-on-k8s / spark

Apache Spark enhanced with native Kubernetes scheduler back-end: NOTE this repository is being ARCHIVED as all new development for the kubernetes scheduler back-end is now on https://github.com/apache/spark/
https://spark.apache.org/
Apache License 2.0

Looking for a way to run only the kube unit/integration testing #426

Closed: erikerlandson closed this issue 7 years ago

erikerlandson commented 7 years ago

By default, running the documented command ...

build/mvn integration-test \
    -Pkubernetes -Pkubernetes-integration-tests \
    -pl resource-managers/kubernetes/integration-tests -am

... runs all of Spark's tests. This not only takes forever, but it also required me to raise my allowed number of open file descriptors (ExecutorAllocationManagerSuite seems to be the main culprit here).
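
For anyone hitting the same limit, raising the soft limit for the current shell session should be enough; the value below is a guess at something sufficient, not a measured requirement:

# Raise the open file-descriptor soft limit for this shell session only.
# 4096 is an assumed value, not the number these suites actually need.
ulimit -n 4096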

If there is already an easy way to do this from the Maven CLI, I can't find it. If there isn't, I'm wondering whether somebody more conversant with Maven knows how to reconfigure the POMs to make this easier.

erikerlandson commented 7 years ago

cc @foxish @mccheah

ssuchter commented 7 years ago

Is your question how to skip spark’s testing or how to change the allowed number of open file-descriptors from maven?

erikerlandson commented 7 years ago

Skip Spark's testing; put another way, run only the kube-related testing.

erikerlandson commented 7 years ago

Hmm, looking at the CI console output, maybe it's this:

-Dtest=none -DwildcardSuites=org.apache.spark.deploy.kubernetes.integrationtest.KubernetesSuite
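
If those are the right knobs, the full invocation would presumably be the documented command with the two properties appended (untested, so consider it a sketch):

build/mvn integration-test \
    -Pkubernetes -Pkubernetes-integration-tests \
    -pl resource-managers/kubernetes/integration-tests -am \
    -Dtest=none \
    -DwildcardSuites=org.apache.spark.deploy.kubernetes.integrationtest.KubernetesSuite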

ssuchter commented 7 years ago

I was just looking into something like that. That feels right…

Sean


kimoonkim commented 7 years ago

Yes, I believe we did that in Jenkins exactly for that purpose.

ifilonenko commented 7 years ago

build/mvn clean pre-integration-test -T 4C \
    -Pkubernetes -Pkubernetes-integration-tests \
    -pl resource-managers/kubernetes/integration-tests -am \
    -DskipTests

build/mvn -B clean integration-test \
    -Pkubernetes -Pkubernetes-integration-tests \
    -pl resource-managers/kubernetes/integration-tests -am \
    -DextraScalaTestArgs=-Dspark.docker.test.persistMinikube=true \
    -Dtest=none \
    -DwildcardSuites=org.apache.spark.deploy.kubernetes.integrationtest.KubernetesSuite

erikerlandson commented 7 years ago

This seems to work, at least for the integration tests, since KubernetesSuite is only one test class. Unit tests would need some kind of wildcard like .*kubernetes.* instead. Anyway, I can close this.
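
For the unit-test side, I believe ScalaTest's Maven plugin treats -DwildcardSuites as a package name (running suites in that package and its subpackages) rather than a true regular expression, so a sketch for running only the kubernetes unit suites might look like the following; the module path and package prefix here are assumptions, not verified against this branch:

build/mvn test \
    -Pkubernetes \
    -pl resource-managers/kubernetes/core -am \
    -Dtest=none \
    -DwildcardSuites=org.apache.spark.deploy.kubernetes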