databricks / sbt-spark-package

Sbt plugin for Spark packages
Apache License 2.0

Packaging a multi-module project #11

Open anuragkh opened 9 years ago

anuragkh commented 9 years ago

Hi!

I have a multi-module project here which I'm trying to publish as a Spark package. It has 'core' and 'spark' modules, which would be bundled together in a jar and published.

The README doesn't seem to contain instructions for this -- what changes would be needed to package and publish a multi-module project?

Thanks! Anurag

brkyvz commented 9 years ago

If spark depends on core, and core can be found on a repository (Maven, Bintray, GitHub), you can simply run spark/spPublish in the sbt console. Please set all related keys (sparkVersion, sparkComponents, spDependencies, ...) under the spark project. If you need more information on project scoping, please check out the sbt docs.
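
For reference, a minimal build.sbt sketch of that setup might look like the following. The module names, organization, versions, and the spDependencies entry are placeholders, not values from this issue; only the key names (spName, sparkVersion, sparkComponents, spDependencies) come from the plugin:

```scala
// Sketch of a two-module build where only the `spark` module is published
// as a Spark package. All names and versions below are illustrative.

lazy val core = (project in file("core"))
  .settings(
    organization := "com.example",          // placeholder
    name         := "my-project-core",
    version      := "0.1.0",
    scalaVersion := "2.10.5"
  )

lazy val spark = (project in file("spark"))
  .dependsOn(core)                          // spark depends on core
  .settings(
    organization := "com.example",
    name         := "my-project-spark",
    version      := "0.1.0",
    scalaVersion := "2.10.5",
    // sbt-spark-package keys, scoped under the `spark` project:
    spName          := "example/my-project",            // "<github org>/<repo>" placeholder
    sparkVersion    := "1.4.1",
    sparkComponents += "sql",
    spDependencies  += "databricks/spark-csv:1.2.0"     // illustrative only
  )
```

With this layout, running `spark/spPublish` from the sbt console packages and publishes only the `spark` module; its pom will list `core` as a dependency, so `core` still needs to be resolvable from a repository (Maven, Bintray, GitHub) for downstream users.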