ThadHouse closed this issue 5 years ago
It looks like you could use Artifact Transforms to unzip the dependency and share it between the projects. The samples in the user manual show an unzip transform that should be mostly what you need. The new native plugins use an Unzip artifact transform as well to extract header dependencies.
As explained above, the new C++ plugins have been using artifact transforms since 5.2, so it shouldn't be a problem.
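For reference, the approach in the user manual samples looks roughly like this in a Kotlin DSL build script. This is a sketch using the Gradle 5.3+ `TransformAction` API; the `Unzip` class name and the attribute values are illustrative, not from any published plugin:

```kotlin
import org.gradle.api.artifacts.transform.InputArtifact
import org.gradle.api.artifacts.transform.TransformAction
import org.gradle.api.artifacts.transform.TransformOutputs
import org.gradle.api.artifacts.transform.TransformParameters
import org.gradle.api.attributes.Attribute
import org.gradle.api.file.FileSystemLocation
import org.gradle.api.provider.Provider
import java.util.zip.ZipFile

// Transform that turns a resolved zip artifact into an extracted directory.
abstract class Unzip : TransformAction<TransformParameters.None> {
    @get:InputArtifact
    abstract val inputArtifact: Provider<FileSystemLocation>

    override fun transform(outputs: TransformOutputs) {
        val zip = inputArtifact.get().asFile
        val unzipDir = outputs.dir(zip.nameWithoutExtension)
        ZipFile(zip).use { archive ->
            for (entry in archive.entries()) {
                val out = unzipDir.resolve(entry.name)
                if (entry.isDirectory) {
                    out.mkdirs()
                } else {
                    out.parentFile.mkdirs()
                    out.outputStream().use { o ->
                        archive.getInputStream(entry).use { it.copyTo(o) }
                    }
                }
            }
        }
    }
}

// Registration: request the "unzipped-directory" form of artifacts
// that were published as zips, and Gradle runs (and caches) the transform.
val artifactType = Attribute.of("artifactType", String::class.java)
dependencies {
    registerTransform(Unzip::class) {
        from.attribute(artifactType, "zip")
        to.attribute(artifactType, "unzipped-directory")
    }
}
```

Because the transform output is cached, multiple configurations resolving the same zip should get the same extracted directory back instead of re-extracting it.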
Ah ok, cool. Any chance that UnzipTransform class can be moved out of internal? So much of the native stuff is internal that it's very hard to actually extend things properly. If not, that's OK, I'll just take the internal dependency anyway, since my plugins already depend heavily on the compiler internals.
Also, my attempt to implement an ArtifactTransform that maps to a NativeDependencySpec is completely failing. Any tips or guides on how to do that?
The UnzipTransform class won't be made public. Instead, we want to release APIs that let you express what you want to do with the dependency, such that an archive could be tagged to indicate whether the user wants the archive itself or its contents. Gradle would then take care of everything else for you.
Unfortunately, NativeDependencySpec is completely separated from the Gradle dependency engine. To consume compressed archives in the software model, you will need to use the Gradle dependency engine and then use the prebuilt library support in the native plugins. Previously, I did something similar at configuration time by skipping the download and extraction if the zip was already extracted. I didn't use ArtifactTransform because it didn't exist yet. It worked pretty well. With ArtifactTransform, you should be able to cut down on the amount of custom code.
Ok. I just ended up using it anyway, and was able to get it hooked into our NativeDependencySpec setup without too much hassle. For us at least, NativeDependencySpec works better than prebuilt libraries (we've tried both), and it wasn't too bad to get working, which is nice.
I am running into this issue in the old software model, but looking at the source code, the same thing will happen in the new setup as well.
I have a multi-project build (around 10 projects) that all use the same Maven dependency. This dependency is about 100 MB and around 600 files. On Windows, it can take upwards of 15 seconds to fully hash and extract before it is usable by the compiler. Multiplied across 10 projects, that means almost 3 minutes spent just extracting dependencies so they can be fed to the compiler. To work around this, I was exclusively adding dependencies to project.rootProject, which solved the speed issue, but I'm assuming this isn't 100% safe. So I might recommend an official way to share dependency extraction between subprojects, as otherwise just handling dependencies can multiply build times by the number of projects. I don't know how easy this will be to do with Gradle's internals, but for bigger projects it's definitely necessary. Java doesn't have this issue because it doesn't have to extract files for the compiler, but native does, and file extraction is especially slow on the Windows file system.
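The share-the-extraction workaround can be sketched in plain Kotlin. The function name and cache layout here are made up for illustration, and a real implementation would key the cache directory on the artifact's checksum rather than its file name and length:

```kotlin
import java.io.File
import java.util.zip.ZipFile

// Extract `zip` into a directory under `cacheRoot` named after the archive's
// name and length (a cheap stand-in for a content hash). If that directory
// already exists, extraction is skipped, so ten subprojects resolving the
// same 100 MB archive pay the unzip cost only once.
fun extractOnce(zip: File, cacheRoot: File): File {
    val dest = File(cacheRoot, "${zip.nameWithoutExtension}-${zip.length()}")
    if (dest.isDirectory) return dest  // already extracted: reuse it

    // Extract into a staging directory first, then rename into place,
    // so a half-finished extraction is never mistaken for a complete one.
    val staging = File(cacheRoot, dest.name + ".tmp")
    staging.deleteRecursively()
    ZipFile(zip).use { archive ->
        for (entry in archive.entries()) {
            val out = File(staging, entry.name)
            if (entry.isDirectory) {
                out.mkdirs()
            } else {
                out.parentFile.mkdirs()
                out.outputStream().use { o ->
                    archive.getInputStream(entry).use { it.copyTo(o) }
                }
            }
        }
    }
    staging.renameTo(dest)
    return dest
}
```

This is essentially what the configuration-time approach above did by hand, and what an artifact transform gives you for free, since Gradle caches transform outputs across projects.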