apache-spark-on-k8s / spark

Apache Spark enhanced with native Kubernetes scheduler back-end: NOTE this repository is being ARCHIVED as all new development for the kubernetes scheduler back-end is now on https://github.com/apache/spark/
https://spark.apache.org/
Apache License 2.0

Develop and build Kubernetes modules in isolation without the other Spark modules #570

Open echarles opened 6 years ago

echarles commented 6 years ago

What changes were proposed in this pull request?

As a developer, I like to load the project in my IDE and work from the command line inside the module I am working on, without having to deal with the other modules. So far, with Spark K8S, we need to build from the root of the module hierarchy.

I have introduced:

How was this patch tested?

So far, with those changes, I can successfully build and test any module or combination of modules (install, test, integration-test).
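A sketch of what that workflow looks like, assuming the goals named above are run from the Kubernetes modules' own directory (the path is illustrative, not quoted from the PR):

```shell
# Build and test only the Kubernetes modules from their own directory,
# without building the rest of Spark first (path is illustrative).
cd resource-managers/kubernetes
mvn install            # build and install these sub-modules
mvn test               # run their unit tests
mvn integration-test   # run their integration tests
```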

mccheah commented 6 years ago

Not too familiar with Maven, but I think this makes sense.

mccheah commented 6 years ago

Do we have a precedent for this, like in another module perhaps?

echarles commented 6 years ago

A lot of Maven projects use that construct, which gives two advantages:

  1. You can build, test and load in the IDE a subset of your modules without the rest (let's say here the modules in resource-managers/kubernetes).
  2. You can define e.g. build plugins, profiles... that apply only to your sub-modules.
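A minimal sketch of the kind of intermediate parent/aggregator POM this construct refers to (coordinates and module names here are illustrative assumptions, not the actual contents of the PR):

```xml
<!-- Hypothetical resource-managers/kubernetes/pom.xml.
     With packaging "pom" and a <modules> list, running Maven from this
     directory builds only the listed sub-modules, and any plugins or
     profiles declared here apply only to them. -->
<project xmlns="http://maven.apache.org/POM/4.0.0">
  <modelVersion>4.0.0</modelVersion>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-kubernetes-parent_2.11</artifactId>
  <version>2.2.0-SNAPSHOT</version>
  <packaging>pom</packaging>

  <!-- Sub-modules built in isolation from the rest of Spark -->
  <modules>
    <module>core</module>
    <module>integration-tests</module>
  </modules>
</project>
```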

I just realized that the Spark code base doesn't use that construct where it could, so I am afraid someone will come and say it is not in line with the Spark project habits.

Also, this PR integrates some definitions to run the integration tests from their own directory. As a separate repository is going to be created, this may be less interesting...

ssuchter commented 6 years ago

retest this please