bazelbuild / rules_scala

Scala rules for Bazel

Support deploy environments (deploy_env) #576

Open hmemcpy opened 6 years ago

hmemcpy commented 6 years ago

(subsumes #560, expands on bazelbuild/bazel#1402)

Bazel has an internal feature called deploy_env for java_binary, which subtracts the classpath of the specified targets from a binary's deploy jar (see a comment by Ulf here).

We tested this feature (by cherry-picking the commit that exposes it in java_binary), and it suits our need to define different deployment targets with different dependencies (something we can't use neverlink for).

As other people seem to find it useful as well, I would like to propose adding support for this to scala_binary. The classpath subtraction can be done in Skylark using the code we already wrote to support the deployment scenarios explained in #560. A sketch of the proposed usage is below.
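For illustration only, a minimal sketch of what this could look like. Note that `deploy_env` on `scala_binary` is exactly what is being proposed here (it does not exist yet), and all target names are made up:

```python
# Hypothetical sketch: the deployment environment is modeled as its own
# binary, and its classpath would be subtracted from my_job's deploy jar.
scala_binary(
    name = "prod_env",
    main_class = "placeholder.Main",  # never executed; only models the environment
    runtime_deps = ["//3rdparty/jvm:provided_deps"],
)

scala_binary(
    name = "my_job",
    srcs = ["MyJob.scala"],
    main_class = "com.example.MyJob",
    deps = ["//3rdparty/jvm:provided_deps"],
    # Proposed attribute: classes reachable from :prod_env would be
    # omitted from my_job_deploy.jar.
    deploy_env = [":prod_env"],
)
```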

WDYT? cc @ittaiz

johnynek commented 6 years ago

Do we need to do anything in rules_scala? Why not use java_binary?

We have talked about getting rid of scala_binary or making it only a macro that wraps java_binary. Does this ticket inform that discussion?

samschlegel commented 5 years ago

It looks like support for deploy_env in java_binary has finally landed in master: https://github.com/bazelbuild/bazel/commit/a92347e405ebe022a7f216541aaa46753e311563

johnynek commented 5 years ago

did that make it into 0.22?

samschlegel commented 5 years ago

Doesn't look like it

thundergolfer commented 5 years ago

For anyone else arriving here, deploy_env is on track to be released with 0.23 in late Feb 2019. https://github.com/bazelbuild/bazel/issues/6495#issuecomment-460932694

stijndehaes commented 5 years ago

Any updates on this? It would be useful for building fat jars for Spark applications. I do not want to include Spark every time, as this dependency is big and already exists on the Spark cluster itself.

johnynek commented 5 years ago

Did you try using deploy_env with java_binary?

You should be able to use java_binary with scala_library.
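A minimal sketch of that combination, assuming Bazel 0.23+ for deploy_env (all labels here are placeholders):

```python
# The Scala code lives in a library; java_binary only provides the entry point.
scala_library(
    name = "job_lib",
    srcs = ["Job.scala"],
    deps = ["//3rdparty/jvm:spark"],
)

java_binary(
    name = "job",
    main_class = "com.example.Job",  # class compiled into :job_lib
    runtime_deps = [":job_lib"],
    # Hypothetical environment binary whose classpath gets subtracted
    # from job_deploy.jar.
    deploy_env = ["//tools:spark_env"],
)
```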

johnynek commented 5 years ago

You could also use jar jar to zap the classes you don’t want from the deploy jar:

https://github.com/johnynek/bazel_jar_jar
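A rough sketch of that approach; treat the load label and the rule file name as assumptions that vary by setup:

```python
load("@com_github_johnynek_bazel_jar_jar//:jar_jar.bzl", "jar_jar")

# shade_rules is a plain text file of jarjar directives, e.g. to drop
# Spark classes from the output jar:
#   zap org.apache.spark.**
jar_jar(
    name = "my_job_slim",
    input_jar = ":my_job_deploy.jar",
    rules = "shade_rules",
)
```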

ittaiz commented 5 years ago

Stijn, what about using java_binary, like Oscar suggested back then? I don't remember why we didn't want to use it, but it does make sense.

thundergolfer commented 5 years ago

What about using java_binary

This is what we do, and it works. The main_class can be 'fake', and you put the provided deps in runtime_deps. You can then use this target like so:

deploy_env = [
    "//tools/build/spark-cluster-runtime",
]
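
Spelled out a bit more, a sketch of the same workaround (package and class names here are invented):

```python
# Models what the Spark cluster already provides. The main_class is fake;
# this binary exists only so deploy_env can subtract its classpath.
java_binary(
    name = "spark-cluster-runtime",
    main_class = "does.not.Exist",
    runtime_deps = ["//3rdparty/jvm/org/apache/spark:spark_core"],
)

java_binary(
    name = "my_job",
    main_class = "com.example.MyJob",
    runtime_deps = [":my_job_lib"],  # a scala_library with the actual job code
    deploy_env = ["//tools/build/spark-cluster-runtime"],
)
```

Building :my_job_deploy.jar then yields a fat jar without the cluster-provided Spark classes.
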
johnynek commented 5 years ago

The reason we made scala_binary is that it predates the "java sandwich", so Java rules could not depend on Scala rules. That motivation is now gone.