axelfontaine closed this issue 3 years ago.
+1 for maven and also bonus question: is there any Nexus for adopt-jdk?
We would like to have a proxy repository in our Nexus and fetch AdoptOpenJDK binaries from there -- by having Nexus proxy to GitHub.
The problem is that there is no browsable URL starting from here => https://github.com/AdoptOpenJDK/openjdk8-binaries/releases/tag/jdk8u212-b03
most probably because GitHub does not serve binaries the way they could be served via a Maven/Nexus repository.
Maybe there are some workarounds?
Please see api.adoptopenjdk.net
+1 for maven. I would also offer some help getting there, as I publish stuff under org.adoptopenjdk already.
Need to research other OpenJDK JDKs and JREs and see if anyone else does it.
I found this old project: https://github.com/alexkasko/openjdk-unofficial-builds. Artifacts are available on Maven Central: https://repo1.maven.org/maven2/com/alexkasko/openjdk/1.7.0-u6-unofficial-b24/
Is Maven Central really the right place? Why do you want to have a JDK binary as a dependency in a Maven build? It can be published to Maven Central, but will anybody really use this? If you want a second place to store the binaries and provide them for download, maybe there are better options. As far as I know, Bintray provides binary repositories, for example. There you can add additional metadata and have some other nice options - like syncing your versions to Maven Central :D
@hendrikebbers
Why do you want to have a JDK binary as a dependency in a Maven build? It can be published to maven central but will anybody really use this?
Reading the OP before asking questions in a thread might be helpful.. IDK. At least I have the same reason to ask for this as the OP:
We would like to have proxy repository in our Nexus and fetch AdoptOpenJdk binaries from there
ok, but this just proxies it. The general question is: why do you want JDK binaries as Maven dependencies? Or do you have them in your Nexus just for local downloads? In that case I assume that there are better solutions than Maven Central.
Simplest case: you build an assembly with Maven for an end user, and that assembly needs to include a JRE.
Since we already have it, there is the potential to put them in JFrog Artifactory.
Having said this, almost certainly you could use something like https://github.com/maven-download-plugin/maven-download-plugin as an alternative method of obtaining a binary rather than requiring it to be an actual maven dependency.
First, you can download anything via the maven-download-plugin; that is not an argument against using the Maven dependency mechanism for your project's dependencies. (Not to mention that using the download plugin together with the assembly plugin requires much more typing.)
Second, we need to mirror the download source. Per our policy, external dependencies have to be mirrored. Mirroring the dependency with Nexus is the simplest solution, as we already do this for all other project dependencies.
In addition to what @jakub-bochenski already mentioned, I just wanted to add that using the download plugin is generally a poor choice as it doesn't let you take advantage of caching in the local Maven repo as well as all other facilities dependencies enjoy out of the box, including great integration with the dependency plugin (unpacking) and the assembly plugin (repackaging).
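As a sketch of that out-of-the-box integration - assuming coordinates along the lines of what is proposed further down in this thread, which are illustrative only - unpacking such an artifact during the build would look roughly like this:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-dependency-plugin</artifactId>
  <executions>
    <execution>
      <id>unpack-jre</id>
      <phase>prepare-package</phase>
      <goals>
        <goal>unpack</goal>
      </goals>
      <configuration>
        <artifactItems>
          <artifactItem>
            <!-- illustrative coordinates; nothing has been published under them yet -->
            <groupId>net.adoptopenjdk.openjdk</groupId>
            <artifactId>jdk.hotspot</artifactId>
            <version>8u232</version>
            <classifier>x64_linux</classifier>
            <type>tar.gz</type>
            <outputDirectory>${project.build.directory}/jre</outputDirectory>
          </artifactItem>
        </artifactItems>
      </configuration>
    </execution>
  </executions>
</plugin>
The assembly plugin can then pick up the unpacked directory like any other build output.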
It would certainly help if those in favor of making JDKs/JREs available via Maven could outline the requirements and what it would entail. For example: Are there any sample projects already bundling JDKs/JREs (I want to see code)? What are they doing with those? Is it required to provide special integration, e.g. inheriting an SSL trust store from somewhere? What about dependency declarations, e.g. do the JDKs/JREs have to be available on Maven Central so that other public packages can depend on them? Do we have to repackage the tarballs in another format? Is this use case still valid taking into account that the JREs are deprecated?
In the end, someone has to support that long-term. That person should ideally use this feature himself/herself. Otherwise it's going to break pretty fast.
@axelfontaine @jakub-bochenski ok, I understand your use case and the benefit of Maven (local cache & handling with available plugins). I assume it will be quite tricky to define the coordinates, since there are several parameters that need to be put in the artefact classifier / artefact id (operating system + architecture, J9 or HotSpot, JDK or JRE ...). Next to this, I assume that all artefacts must be provided as ZIP (or can the dependency plugin handle tar.gz?). JavaFX provides a Maven plugin to define the dependencies in an OS-independent way and then, at build time, get the OS-specific dependencies. I assume something like this must be done if you want to have an OS-independent build. Any thoughts on that?
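One rough sketch of such OS-independent handling - the property name and classifier values below are purely illustrative - is a set of profiles activated per OS family that select the classifier at build time:
<profiles>
  <profile>
    <id>jdk-windows</id>
    <activation>
      <os><family>windows</family></os>
    </activation>
    <properties>
      <!-- illustrative classifier value; the actual naming is discussed below -->
      <jdk.classifier>x64_windows</jdk.classifier>
    </properties>
  </profile>
  <profile>
    <id>jdk-linux</id>
    <activation>
      <os><family>unix</family></os>
    </activation>
    <properties>
      <jdk.classifier>x64_linux</jdk.classifier>
    </properties>
  </profile>
</profiles>
The JDK dependency would then use <classifier>${jdk.classifier}</classifier>. (macOS would need its own profile, and a plugin doing the detection would of course be nicer than hand-written profiles.)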
Code has been asked for. Provided in the attachment.
In the following, I'll also try to be more concrete as to how publishing to Maven Central can be accomplished.
The most "difficult" thing about publishing binary artifacts from AdoptOpenJDK is getting agreement on how to use the Maven coordinates system. This obviously influences how it will be used by the end consumer. I propose the following:
groupId: always 'net.adoptopenjdk.openjdk'
artifactId: Pattern: [jre|jdk].<jvm-type>, so for example 'jdk.hotspot' or 'jre.openj9'. JVM-type is called 'variant' in the AdoptOpenJDK world and JRE/JDK is called 'binary_type'.
version: A string like '8u232' or '11.0.5_10'
classifier: Pattern: <arch>_<os>[_<buildqualifier>], where 'buildqualifier' is just some arbitrary string which may be appended for special purposes, for example 'XL' to signify Large Heap builds. In any case, values look like the following: 'ppc64le_linux', 's390x_linux', 'sparcv9_solaris', 'x64_windows', 'x64_windows_XL', etc.
Nearly all of this can be built using the JSON variables which are published by AdoptOpenJDK on GitHub. It is only what I call 'buildqualifier' which needs some attention. It seems to have been added as an afterthought in the AdoptOpenJDK world.
All in all, the proposed naming is here (Excel sheet with examples). I've opted for a proposal where the main published artifact is a text file (which basically just says: "Nothing to see here") and then all the bin packages are in so-called secondary artifacts. This means that a consumer must specify a type and a classifier when using the bin package in a Maven build. It will look like this:
<dependency>
<groupId>net.adoptopenjdk.openjdk</groupId>
<artifactId>jdk.hotspot</artifactId>
<version>8u232</version>
<classifier>x64_linux</classifier>
<type>tar.gz</type>
</dependency>
(Gradle users and SBT users can of course consume too)
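A minimal sketch of the Gradle side, again assuming the proposed (not yet published) coordinates; the configuration name is arbitrary:
configurations {
  // arbitrary configuration name for holding the JDK archive
  jdkArchive
}
dependencies {
  // group:name:version:classifier@extension
  jdkArchive 'net.adoptopenjdk.openjdk:jdk.hotspot:8u232:x64_linux@tar.gz'
}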
I've opted for a model where the target is Maven Central. Of course, the AdoptOpenJDK org can have its own Maven server and publish there, but Maven Central is by far the easiest for consumers because it doesn't need extra config for the consumer. Also, there's a certain amount of vetting being performed on Maven Central so it is more "acceptable" to consume from in many corporate networks.
The procedure for publishing to Maven Central requires some one-time steps:
1. The AdoptOpenJDK org needs to register with Sonatype's JIRA -- basically getting credentials and claiming the namespace net.adoptopenjdk. Sonatype is the gatekeeper for Maven Central.
2. AdoptOpenJDK needs a public PGP key, and the key must be published to a public keyserver. I would guess this may already be the case? If not, there is a decent guide here.
Performing an upload to Maven Central
Even though a Maven repo is merely an HTTP server, the easiest way to publish into it is by using - you guessed it - Maven. My example script does it all from the command line, and it uses the Maven GPG Plugin because this is the easiest way of accomplishing the two steps required, namely signing the files and actually uploading them -- all from the command line.
This package contains my example script .. which actually works. Simply customize the top-of-file variables in the publish_to_maven_central.sh script and then execute it with the arguments as specified in the header.
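To give an idea, the upload of a single file presumably boils down to something like the following (URL, repository id, coordinates and file name here are illustrative only):
mvn gpg:sign-and-deploy-file \
  -Durl=https://oss.sonatype.org/service/local/staging/deploy/maven2/ \
  -DrepositoryId=ossrh \
  -DgroupId=net.adoptopenjdk.openjdk \
  -DartifactId=jdk.hotspot \
  -Dversion=8u232 \
  -Dclassifier=x64_linux \
  -Dpackaging=tar.gz \
  -Dfile=OpenJDK8U-jdk_x64_linux_hotspot_8u232b09.tar.gz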
Perhaps I should mention that my script stops short of automating the push from the staging repo to actual Maven Central. This is deliberate. Publishing into Maven Central is not something which can be retracted (at least not easily). Therefore you would want to manually inspect what will be published, using the UI at https://oss.sonatype.org, before actually releasing it to Maven Central -- at least until you get the hang of it. The manual process is described here. Steps for automating - if you feel bold enough - can be provided.
The script works as-is. But it is meant as an example. There are a zillion ways to do this and it is all about finding the way which best fits with the existing infrastructure and environment at AdoptOpenJDK.
We should discuss if an upload should be done to Bintray instead of Maven Central. Bintray provides an option to sync uploaded artifacts to Maven Central. By doing so we would have the artifacts uploaded to both systems that are the defaults for Gradle & Maven.
@hendrikebbers I must admit I'm not too familiar with Bintray Jcenter.
Bintray Jcenter isn't "a default" for Maven. Maven consumers would have to configure it explicitly. Bintray Jcenter proxies Maven Central artifacts, while the opposite is not true. Yes, I can see there's a sync option, but as far as I can tell, it requires intervention. Even if it can be made automatic (can't tell if that is the case?), it would require AdoptOpenJDK to have two accounts - one for Jcenter and another for OSSRH - and to go through the initial setup with both sites.
My conclusion is: Just publish to Maven Central. Bintray Jcenter users will be happy too as it automatically becomes available via that channel also. Two birds with one stone.
Let me know if I've misunderstood.
@lbruun Bintray JCenter is the default for Gradle as Maven Central is the default for Maven. So it would be perfect to have the artefacts in both repos. Bintray offers a sync that uploads the artefacts from Bintray/JCenter to Maven Central. If we do it that way we only need to upload the artefacts once & only need one configuration in the build script. The upload to Maven Central can then be easily done in Bintray. See https://blog.bintray.com/2014/02/11/bintray-as-pain-free-gateway-to-maven-central/
But on the other hand I was not aware that JCenter mirrors Maven Central. That's new to me... So maybe Maven Central is good enough. Let me have a look :)
Bintray JCenter is the default for Gradle as Maven Central is the default for Maven.
No, it is not.
Gradle has no default.
You either define repositories { mavenCentral() } or repositories { jcenter() } to use the one or the other.
If the Gradle you use has JCenter as a default repository, you probably have a custom Gradle distribution with an init script that adds it, or similar.
But on the other hand I was not aware that JCenter mirrors Maven Central. That's new to me... So maybe Maven Central is good enough. Let me have a look :)
Yes, it does. :-)
An option which is perhaps very easy to implement is to publish to GitHub Packages - since AdoptOpenJDK already has its binaries on GitHub.
For the Maven consumer, it would mean configuring an additional repo, e.g.:
https://maven.pkg.github.com/AdoptOpenJDK/openjdk-binaries
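On the consuming side that would look roughly like the following; note that GitHub Packages additionally requires an authentication token even for downloads, supplied through a matching <server> entry in settings.xml:
<repositories>
  <repository>
    <!-- id must match the <server> entry carrying the GitHub token -->
    <id>adoptopenjdk-github</id>
    <url>https://maven.pkg.github.com/AdoptOpenJDK/openjdk-binaries</url>
  </repository>
</repositories>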
So yes, publishing the AdoptOpenJDK binaries to Maven Central is still preferable over publishing to GitHub Packages, but the latter is certainly also a vast improvement compared to today.
(GitHub Packages was introduced in May 2019. I must admit I was not aware of it until last week :-))
If they are compatible with Gradle now. In the past they were not.
@Vampire. This thread is about publishing in the Maven format (i.e. into a Maven repo). The Maven format can be consumed by Maven clients and Gradle clients alike. Are you saying that GitHub somehow managed to do a Maven Repo implementation which cannot be consumed by Gradle clients?
That's exactly what I'm saying.
Well, GitHub Packages are still in beta, so it is ok if not all works as expected yet. Just something to consider. They didn't respond properly to HEAD requests, which Gradle uses to be more efficient.
I think that currently the OSX artifact has a different structure than the rest (/Contents/Home), which is a bit of a PITA from a deployment standpoint, since the different structure is an OS convention for users to run the program itself, versus having any impact on whatever it is that depends on a JDK.
FWIW I think the "normal" OS X download can have that structure, but I'd want the dependency to follow the standard layout instead. For a while the OS X artifact matched the rest, structurally, but at some point it was changed, and I vaguely remember that it was to match Oracle's downloads... but that was a while back and there might have been other reasons.
Since dependencies are for devs — and frankly the most exciting thing about AdoptOpenJDK for me is that it nixes the crazy do-this-before-that-step of having potentially very new users needing to "agree", download, and install another separate program prior to running your program... i.e. automation! — I vote heavily that all artifacts be zip files, and follow the same basic structure— at least as far as the executables go.
It seems like at least a few people who are already using java as a normal dependency have commented on this issue— anyone here need or want the dependency format to match the operating system conventions? Or does everyone move the OS X binaries after extraction, as most of the existing JDK download automation I've seen does, and as I've done for way way too long?
If we're looking at Java as strictly a binary dependency, then following the OS conventions seems better, but I don't think we should treat it as such— at least not for the dependency.
IMHO only JDK installers should need to care about OS conventions, and as I mentioned before, what I love about this project is that users don't have to jump through hoops and run installers anymore.
Besides the layout of the artifacts, the coords are the other thing to hammer out. I think the proposed ones are fine. We could delineate using "group" as well. If you're coming from Maven it might seem a bit odd to have multiple groups for the same artifact, but it works.
In closing, what I'm after here is not a Maven artifact to use with Maven per se, but the standardization that a dependency brings (or should strive hard to). I've been treating Java as a normal dependency with builds for 10+ years, and it's always been a bit ugly or some kind of hack. It would be so nice to not have to "do stuff" anymore, or run mirrors of repackaged zips just so you don't have to add a bunch of if/then logic somewhere(s)... at the very least a standard archive format would be a plus.
@denuno To me it is implied in the topic headline that what would be published in such a repo would indeed be archives, not, say, installer binaries.
The AdoptOpenJDK project currently publishes:
For Windows: .msi and .zip
For Mac OSX: .pkg and .tar.gz
For Linux: .tar.gz
As a developer who wants to bundle a runtime with an application and treat it as a dependency in my Maven build, I find .msi and .pkg uninteresting. No need to publish those as far as I'm concerned.
However, if you want the same archive format to be used across OSes then there's a slight problem: Linux and Mac OSX would require the execute bit to be set on the executables. Only the tar.gz format can handle this; ZIP files cannot. On Windows there's no such concept as an execute bit on files, and furthermore tar.gz is somewhat of an alien on the Windows platform. For this reason, it cannot be ZIP for all platforms, I think.
@lbruun I think it makes sense to have all the artifacts produced be available via Maven coordinates, including the package manager versions.
If files are being kept around, might as well store them in a format that lots of tooling can consume.
Most might choose to automate using the stand-alone versions, but some folk will want to automate using the packaged versions, and it doesn't seem like there's any reason to have different URI formats for file retrieval.
If the files are stored only in a DB, using tags and not trees, this is all perhaps extra work— but if they're being stored on disk anyway the accessibility is zero cost.
Similar logic goes for using the zip archive format — You can store executable permissions in a zip archive, which is all we need here, and zip is the lowest common denominator.
There's more/better build tooling for zip than tar + gzip, and it's the same format as Jars (and basically archive/bundle data in general)... xz format, say, would be pretty nifty, but zip works on Windows by default, and it's the only one (AFAIK— maybe they've added tar and gz, but that would be recent and maybe only with the nix extension?), so I think zip is the only format for Windows (out of the box), and since all the other OSes can handle zip basically by default as well, I think it makes the most sense as an archive format.
I don't care much about the package format. Maven should be able to handle both tar.gz and .zip, as both have support in the assembly plugin.
I also don't care much about exact coordinates. Maybe I'm missing something, but as long as I can specify JDK/JRE, a version and architecture I'm fine. (Note the group handling might be important for some static vulnerability scanners).
I would ask for the tarball (zipball?) with just plain JRE/JDK first. I think packages with system installers could be added later as a convenience.
I don't care much about exact coordinates either, but I would love the format to be zip if it's an archive, for all regardless of OS, as well as nixing the custom java home sub-directory in the OS X archive.
My use case is provisioning cross-platform programs/builds/whatnot with Java. For years I've been having to repackage the artifacts into just this format — zip with the standard directory layout — because it simplifies a lot of logic. I'm not after a user-facing Java version, I'm after a dependency, so the different archive formats and directory layouts don't do anything for my use case besides complicate matters.
It's much nicer to be able to use a simple list instead of a map and if/then/else statements. Maven and Gradle &c. can handle whatever you throw at them, but the core PITA is, you're doing a zip task for zips, and tar/gz tasks for tar gzip files, and then for OS X you're moving stuff... If you have to have if/then/else, then do it, else don't. (Heh.)
I'd like to see the API downloads all be zips, and maybe even follow the same dir layout as well (use like a DMG for OS X's different layout version, instead of doing that in the archive perhaps?), since I hit the API with some scripts to download the latest and whatnot, and I have to jump through these hoops there.
But they've been this way for a while now, so changing would be rough, whereas these new dependencies are a chance to chime in and maybe have it go the way I think makes the most sense. The main argument against zip might be that it is a bit less common on *nix systems, but there's no guarantee tar and gzip are installed either (except for like OS X), so... ¯\_(ツ)_/¯
If I have to mirror stuff for eternity, so be it— I'm super-duper-amazeballs-happy with AdoptOpenJDK, and the API— all of this is just so much less painful than it used to be thanks to this project. A standard dependency is just more of the same Good Stuff, and will make that repackaging easier if need be (since I use a standard build system to do that bit). The API does the job, but a standard dependency will provide even more accessibility.
100% behind the idea that the JDK archives should be rolled out before the package manager artifacts, and that in general that's what matters the most, and archive format and file layout are not as important as getting an official dependency out there (jcenter is as fine as maven central AFAIC), but I'd love to nix a few ifs with one stone at the same time so to speak, if possible. :)
Now that bintray is dead and buried, can we finally get this onto Maven Central and move on with our lives?
@axelfontaine I don't know why Bintray should have any influence here... I assume that this task is going to end up with me, anyway, otherwise someone would have volunteered by now. But I still didn't see an example project that outlines how those artifacts would be consumed, or an explanation why jlink/jpackage are insufficient. I'm not asking for those things to obstruct this endeavour, but I want to be able to test things and make sure that they work as intended. Because I never had the requirement you and others have, I need that information.
@aahlenst Here is the example I referred to in my original post: https://github.com/flyway/flyway/blob/master/flyway-commandline/pom.xml#L224
The requirement is simple: build and package a CLI distribution (which bundles a Java runtime) for multiple platforms (Windows, macOS, Linux) as part of a single build. This involves downloading the JRE for each platform as a Maven dependency (currently from a private repo, hence this issue), extracting it and packaging as part of the assembly for each platform. The end result is 3 artifacts for this CLI: a Windows zip, a macOS tar.gz and a Linux tar.gz, each of which can then simply be extracted on the desired target platform by end users.
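As a rough sketch of the unpack-and-bundle step (coordinates are illustrative; the linked Flyway pom shows the real thing), an assembly descriptor can pull in and unpack such a dependency directly:
<dependencySets>
  <dependencySet>
    <includes>
      <!-- illustrative coordinates for the runtime artifact -->
      <include>net.adoptopenjdk.openjdk:jre.hotspot</include>
    </includes>
    <unpack>true</unpack>
    <useProjectArtifact>false</useProjectArtifact>
    <outputDirectory>jre</outputDirectory>
  </dependencySet>
</dependencySets>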
JLink: Not relevant for this discussion. This could be added as another intermediary build step for CLIs shipping with Java 9+. Or not. Either way, this doesn't impact this issue.
JPackage: Only available for Java 14+, no cross-platform builds, produces an installer/distro-specific package (not desired).
I hope this helps clear things up.
@aahlenst, my two cents:
I don't know why Bintray should have any influence here.
Someone mentioned Bintray as an alternative to Central, which seems to have sent this discussion into a weird direction.
I still didn't see an example project that outlines how those artifacts would be consumed
It's not a full blown example, but here's the outline of what we do at $workplace:
Manually publish Windows AdoptOpenJDK to a local Nexus. I don't have the exact script right now, but it's as bare-bones as can be. The published artifact is the downloaded AdoptOpenJDK ZIP as is, with no modifications. I can try and post the whole publishing script tomorrow.
UPD: It turns out there are modifications - for convenience, the ZIP is repackaged after stripping the (version-dependent) top-level directory. Also, there's some noise about publishing for multiple bitnesses. But it all ultimately comes down to this:
mvn deploy:deploy-file \
"-DgroupId=$organization" \
"-DartifactId=$artifact" \
"-Dversion=$version" \
-Dpackaging=zip \
"-Dfile=$zip" \
-DrepositoryId=nexus-3rd-party \
-Durl=http://nexus/nexus/content/repositories/third_party
This artifact is downloaded at build time, and extracted into a temporary/transient directory.
The build script runs jlink from the extracted JDK, producing a JVM image for our application (with dynamically detected modules).
The resulting JVM image is then packaged in an executable installer (MSI).
That's about it. I won't go into details, but I can point you towards the SBT plugin I use for running jlink. Though I don't think it makes any difference.
Basically, all I need from this issue is a ZIP artifact in some Maven repository (so that it can be cached on local Nexus), containing a specific version of the Windows JDK (so that it won't change and break the build), including the jlink executable.
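For concreteness, that jlink step is roughly the following (shown in Unix shell syntax for brevity; module list and paths are simplified placeholders, not our exact setup):
# EXTRACTED_JDK points at the JDK unpacked from the Maven artifact
"$EXTRACTED_JDK/bin/jlink" \
  --module-path "$EXTRACTED_JDK/jmods" \
  --add-modules java.base,java.logging,java.sql \
  --strip-debug \
  --no-header-files \
  --no-man-pages \
  --output target/jvm-image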
or an explanation why jlink/jpackage are insufficient
See above - I use jlink as a part of the build process, but I'd like to have a known version of the JDK containing it, and I'd like that version to be specified by the project rather than by the environment. I haven't used jpackage (and am unlikely to use it in the future, since I need a fair share of customization in the MSI script), but I don't think it solves this problem either.
Okay. Will do. I don't think it makes sense to start publishing before Eclipse Adoptium (unless everyone wants to change coordinates after a single release), so don't expect something official before the July 2021 CPU.
Please find someone else who does this.
Any update on this?
This won't be happening from Adopt or Adoptium I'm afraid, closing this issue.
Understandable that this is rejected. I no longer see this as a task for JDK distributors.
Adopt/Adoptium have a strict layout for how JDKs are published (i.e. the URLs from where to download). You can "discover" the location to download a JDK bundle from by first retrieving information from the Adoptium API. Then, as the next step, you can use something like the Maven Download Plugin (or similar) in your Maven build to actually do the download.
Another - more generic - option would be to write a Maven plugin around the JDK Disco API. Such approach would work across all JDK distros.
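A rough sketch of what that combination could look like (the plugin configuration and the Adoptium API path are illustrative and would need checking against the current API):
<plugin>
  <groupId>com.googlecode.maven-download-plugin</groupId>
  <artifactId>download-maven-plugin</artifactId>
  <executions>
    <execution>
      <id>fetch-jdk</id>
      <phase>generate-resources</phase>
      <goals>
        <goal>wget</goal>
      </goals>
      <configuration>
        <!-- Adoptium API "latest binary" redirect; the path segments are illustrative -->
        <url>https://api.adoptium.net/v3/binary/latest/17/ga/linux/x64/jdk/hotspot/normal/eclipse</url>
        <outputFileName>jdk.tar.gz</outputFileName>
        <outputDirectory>${project.build.directory}/jdk</outputDirectory>
        <unpack>true</unpack>
      </configuration>
    </execution>
  </executions>
</plugin>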
@lbruun As already outlined more than 2 years ago, your suggestion is a terrible idea. See https://github.com/AdoptOpenJDK/TSC/issues/82#issuecomment-521571451
This really should be on Maven Central so that all existing dependency and caching mechanisms of Maven, Gradle or any other tools accessing the repo can be leveraged out of the box. No need to reinvent the wheel here.
I can only second this, this would really be useful. Maven already has an "API" to query for artifacts that is understood by lots of tools. Please reconsider this, as it would allow much better integration than writing "just another API", inventing "just another caching mechanism" and "just another download mechanism" (proxies, authentication, mirroring...).
As it stands, anyone who has no direct access to "the internet" but only, e.g., a Nexus repo is blocked from using the API or even the raw download URLs.
It would be very useful if JDK and JRE images would be published to Maven Central. This would allow projects bundling AdoptOpenJDK artifacts to add those as Maven dependencies and pull them from there.
For Flyway we had to publish them to our own private Maven repo. Maven then pulls them as part of the build and the assembly plugin unpacks and bundles them in our CLI distribution.
I am sure we are not alone with this need.