jbyler closed this issue 9 years ago
That would be great! The best option would be to do the same as nvm/nvmw and have a shared repository (on the computer) for node binaries, with different versions of node and npm in it. In that case, you wouldn't even need a "node" directory local to the project; the commands could be run directly from the repository. But I think that's a big change; it takes some work and coordination. It's on the radar, though!
I don't know if this is what you had in mind, but for https://code.google.com/p/cmake-maven-project/ I use one Mojo to download the binaries, JAR them, and install them into the local repository.
Then, when the user invokes any of the goals mentioned on that page, a second Mojo unpacks the binaries from the local repository into target/cmake
(unless they're already unpacked) and executes the goal.
Feel free to use my source-code as reference.
I think that to really fit in with other Maven builds it's better to not only cache locally, but also work via Maven repositories (as a lot of corporate environments have them). So my proposal for the download procedure would be something like:
This plays nicer with existing corporate build environments that want all external artifacts properly cached (in Artifactory or Nexus, for example).
I have already thought about that (putting the binaries in the m2 repository) but decided against it because I want the plugin to be build-system-agnostic. Use it from maven, sbt, gradle, buildr, leiningen, you name it. That's also why the source code is separated into core and maven-plugin.
I'm working on being able to configure where node should be installed, so you can set it to ~/.node for example.
Maybe you can upload the Node binaries directly to Nexus and bypass the problem that way? As long as the directory structure mirrors nodejs.org/dist, it should resolve correctly.
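If the mirror keeps the nodejs.org/dist layout, pointing the plugin at it should just be a matter of configuration. A minimal sketch (the Nexus URL and version numbers here are hypothetical placeholders, and the exact parameter names assume the plugin's documented `nodeDownloadRoot`/`npmDownloadRoot` options):

```xml
<configuration>
  <nodeVersion>v0.10.26</nodeVersion>
  <npmVersion>1.4.3</npmVersion>
  <!-- Hypothetical internal mirror; its directory tree must
       mirror nodejs.org/dist so version paths resolve. -->
  <nodeDownloadRoot>https://nexus.example.com/content/repositories/node-dist/</nodeDownloadRoot>
  <npmDownloadRoot>https://nexus.example.com/content/repositories/npm-dist/</npmDownloadRoot>
</configuration>
```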
I understand the need to keep the Maven-specific and generic parts separate (and also to have as much of the functionality as possible on the generic end), but nodejs/npm versions change frequently and I don't want the front-end developers to have to bother with manually uploading artifacts.
Is it okay if I implement this feature as a separate Mojo (not in the generic library module) and submit a pull request?
Since it'll be a separate Mojo, it won't interfere with any existing code (which will continue to be generic).
I know it's very conservative, but I really want this plugin to be as small and focused as possible. Even if a new version of node is released, it's seldom necessary to upgrade, unless there's a bugfix that affects your build. If you're very concerned about the availability of nodejs.org, I suggest you manually commit the "node" folder into your repo. If you really need a nodejs mirror, I guess there's some kind of automatic site mirror tool out there that you can host internally and have updated against nodejs.org/dist every 24 hours. Then you can point the Maven plugin to that mirror. Or, if you need it on a Nexus mirror, you can make a separate Maven plugin for that?
@eirslett Start with my suggestion and work up from there. I assume you're okay with downloading node from nodejs.org and installing it into the local Maven repository? This will speed up builds substantially without changing what files you publish to Maven Central.
@cowwoc +1
Preventing the download on every run would be great too, +1
(CC @tfennelly) Yes, for me too this seems crucial.
@cowwoc +1
+1 In my case, I have a Jenkins job which runs the Maven build, and I have configured the job to emulate a fresh checkout every time, i.e. delete the unversioned files and svn update. This removes the local node installation, so node is downloaded and installed again on every run. For such cases this feature would be a real saviour.
I want the plugin to be build-system-agnostic. Use it from maven, sbt, gradle, buildr, leiningen, you name it.
My impression was that all of these supported the Maven repository format as a de facto standard. Obviously they have different download mechanisms, but I would expect the build system plugin to simply inject a resolver that could convert a GAV into a local file.
(Aether aims to standardize even the download mechanism, though you would still need to inject its settings, logger, and so on.)
That said, @cowwoc’s suggestion would be a great improvement over the current state.
Can't we just use a file:// URI as nodeDownloadRoot/npmDownloadRoot? That URI could point to a cached resource, or one in SCM, and the plugin's download code could handle it transparently. The only downside is that you'd have to mimic the node/npm site layout (root + '/v.../' + file), but this would then handle multiple versions of node/npm.
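A sketch of what that configuration might look like, assuming the plugin's `install-node-and-npm` goal and its `nodeDownloadRoot`/`npmDownloadRoot` parameters (the local mirror path and version numbers are illustrative, not prescribed):

```xml
<plugin>
  <groupId>com.github.eirslett</groupId>
  <artifactId>frontend-maven-plugin</artifactId>
  <executions>
    <execution>
      <id>install-node-and-npm</id>
      <goals>
        <goal>install-node-and-npm</goal>
      </goals>
      <configuration>
        <nodeVersion>v0.10.26</nodeVersion>
        <npmVersion>1.4.3</npmVersion>
        <!-- The local directory must mimic the site layout, e.g.
             /opt/node-mirror/v0.10.26/node-v0.10.26-linux-x64.tar.gz -->
        <nodeDownloadRoot>file:///opt/node-mirror/</nodeDownloadRoot>
        <npmDownloadRoot>file:///opt/npm-mirror/</npmDownloadRoot>
      </configuration>
    </execution>
  </executions>
</plugin>
```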
The only thing we're missing at this point is for someone to create a pull request. If you look at commit logs for the past 6 months you will notice that the author is not fixing any issues directly. He is just merging in pull requests.
(And no, I have no immediate plans to produce a pull request myself)
Here you are! Hope to have it merged soon (and eventually a new release available)
@cowwoc yeah, you're right; I don't have too much time on my hands, and this project kind of grew out of proportion. It was supposed to be a simple weekend project ;-) I'm trying my best to merge PRs as they come, and hopefully I'll be able to write more code myself later on! I have this great vision of how I want to download and cache node/npm in ~/.node_npm/, but I can't ever find spare time to actually implement it... :-/
I'll go with the file:// solution for now.
@eirslett Makes sense. Thanks for being transparent about the situation. I've been in your shoes before, so I understand. :)
@eirslett This library is a very important part of the hobby projects I am working on but, like you, I don't have time to work on it full-time. I looked around and there aren't really any good alternative libraries.
I understand that you are currently focused exclusively on PRs but I think we can both agree that the project would be in a healthier place if we found at least one co-developer to help out.
Perhaps you could post a notice in README.md: "Looking for someone to take over / help with this project. Contact X if you're willing to help". I've seen other GitHub projects do something similar. With the amount of interest in this project, I am hoping that someone will step up.
Yes, it could definitely use some more love. But I'm concerned that it will evolve into bloatware - even more difficult to maintain than it already is. The plugin already works for most people, the cases you typically see in PRs are mostly edge cases. It's a real challenge to try and limit the scope of the plugin and at the same time please as many developers as possible. (You cannot have both) It's also becoming a challenge just to keep the code organised as people are adding more features...
@eirslett Correct me if I'm wrong, but this is only an issue in a clean environment, right? I checked this and, for me, it only downloads the installer if there isn't already a local copy of node. It does not download every time!
Imo this is fine for dev, but I do appreciate that it might be a bit of an issue for automated builds that start with a clean environment on every run. Though I really think people are exaggerating the significance of that.
Yes, it's only an issue for the first, clean build; the downloaded binary is cached inside the project itself. Downloading node adds ~15 seconds to the build, I think, depending on the download speed. Some people intentionally delete the node executable just to get a 100% clean build every time, which means they won't get the benefit of a cached binary (unless it's cached somewhere else).
@eirslett Perhaps it's worth splitting the plugin into multiple pieces then? One for installing node/io.js. Then separate plugins for grunt, gulp, karma, etc.
A workaround might be to zip the node directory and install it into your Nexus, then use maven-dependency-plugin to fetch and unzip it into your project root before frontend-maven-plugin runs. Just an idea.
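That workaround could be wired up with the dependency plugin's `unpack` goal, something like the following sketch (the artifact coordinates are hypothetical; you'd deploy the zipped node directory under whatever GAV your Nexus uses, and the `initialize` phase is just a guess at "early enough" to run before frontend-maven-plugin):

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-dependency-plugin</artifactId>
  <executions>
    <execution>
      <id>unpack-node</id>
      <!-- Run before frontend-maven-plugin's goals -->
      <phase>initialize</phase>
      <goals>
        <goal>unpack</goal>
      </goals>
      <configuration>
        <artifactItems>
          <artifactItem>
            <!-- Hypothetical coordinates of the zipped node directory -->
            <groupId>com.example.internal</groupId>
            <artifactId>node-binaries</artifactId>
            <version>0.10.26</version>
            <type>zip</type>
            <outputDirectory>${project.basedir}</outputDirectory>
          </artifactItem>
        </artifactItems>
      </configuration>
    </execution>
  </executions>
</plugin>
```

With this in place, frontend-maven-plugin would find the node directory already present and skip the remote download.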
One of the reasons people want to download the installer is that they have developers on both Windows and Unix, so checking in the binaries is not an option. Similarly, populating a cache is a lot of make-work: Nexus doesn't support this well, and the burden is then on keeping the cache up to date. A better solution would be to do what npm does: keep a cache of the downloads somewhere and use those in preference to a full download. For bonus points, make the initial download populate the location specified by nodeDownloadRoot.
@andyp1per "So checking in the binaries is not an option" is not strictly true. You can publish the binaries with different classifier values. That's what it's there for.
That said, I think we should keep such artifacts out of central if we can (we need a local cache, not a global one).
It would be great if the plugin could cache the downloaded node installer in one location, and install it to another location (under target, or in a configurable place; see #18). That way you can install into target and get a fresh, virgin installation every time you do "mvn clean", but you don't have to re-download from a remote internet host on every build. This is the approach used by the cargo plugin.
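A configuration for that split might look something like this sketch (the parameter names here are hypothetical illustrations of the idea, not existing plugin options):

```xml
<configuration>
  <!-- Hypothetical: shared cache for downloaded installers,
       surviving "mvn clean" -->
  <downloadCacheDirectory>${user.home}/.frontend-cache</downloadCacheDirectory>
  <!-- Hypothetical: fresh installation target, wiped by "mvn clean" -->
  <installDirectory>${project.build.directory}</installDirectory>
</configuration>
```

That way the cache persists across clean builds while the installation itself stays disposable, which is the cargo-plugin approach described above.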