Open ljharb opened 10 years ago
It looks like the copy-packages code might be assuming everything's from npm. I installed a module, jakl/rbwhat. When copying packages, I got an error. This should be smarter and actually check each package.json - for one, if there are no install scripts, the module can just be copied directly to the new location. In addition, it should ideally be able to figure out from where the module was installed.
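A rough sketch of what that install-scripts check could look like (assuming a POSIX shell with node on the PATH; PKG_DIR is just a placeholder, not anything nvm defines):
# Hypothetical check: does a globally installed package declare any install-time scripts?
PKG_DIR="/usr/local/lib/node_modules/some-package"   # placeholder path
if node -e '
  var scripts = require(process.argv[1] + "/package.json").scripts || {};
  process.exit(["preinstall", "install", "postinstall"].some(function (k) { return k in scripts; }) ? 1 : 0);
' "$PKG_DIR"; then
  echo "no install scripts - can be copied as-is"
else
  echo "has install scripts - needs a proper (re)install or rebuild"
fi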
Indeed that's true.
In fact, as of v0.17.2, despite its name, the copy-packages command doesn't copy anything: it reinstalls the source version's global packages, by name, at their latest versions from the npm registry.
This fails in at least 2 scenarios:
(a) packages that were not installed from the npm registry (e.g., installed from a Git URL or a local folder - see npm help install).
(b) packages that are merely symlinks to local projects, created with npm link (see npm help link).
Fixing (a) requires inspecting each package's package.json's _resolved property, which contains the actual source URL of the package - a URL starting with https://registry.npmjs.org implies a package from the npm registry, for instance.
Note that there's a conceptual difference: such a fix would install the same version as the original package, whereas npm-registry-originated packages are currently reinstalled at their latest version.
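A rough sketch of what such an inspection could look like (this assumes a POSIX shell, node on the PATH, and that npm recorded a _resolved field in each installed package.json - which is not guaranteed for packages installed from local folders or via npm link):
# Classify each regular (non-symlinked) global package by the source URL npm recorded.
npm ls -g --parseable --depth 0 | tail -n +2 |
  while read -r dir; do
    [ -L "$dir" ] && continue   # skip npm-link'ed packages; they are handled separately
    resolved=$(node -e 'console.log(require(process.argv[1] + "/package.json")._resolved || "")' "$dir")
    case "$resolved" in
      https://registry.npmjs.org/*) echo "$(basename "$dir"): npm registry ($resolved)" ;;
      "")                           echo "$(basename "$dir"): no _resolved field - source unknown" ;;
      *)                            echo "$(basename "$dir"): non-registry source ($resolved)" ;;
    esac
  done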
Fixing (b) is simpler: Each global package directory must be tested for whether it is a symlink and, if so, that symlink can simply be copied as is to the new version.
The following commands - based on the current copy-packages code - show how to obtain a list of non-symlinked package names first, and then a list of symlinked package paths (version-selection code omitted for brevity):
# Get space-separated list of *package names* of all REGULAR global packages.
# (tail -n +2 skips the global root folder that `npm ls` prints first.)
INSTALLS_NPM=$(npm ls -g --parseable --depth 0 | tail -n +2 |
  while read -r f; do [ -L "$f" ] || echo "$f"; done |
  xargs -n1 basename | xargs)

# Get space-separated list of *full folder paths* of all SYMLINKED-to-local-projects global packages.
INSTALLS_SYMLINKS=$(npm ls -g --parseable --depth 0 | tail -n +2 |
  while read -r f; do [ -L "$f" ] && echo "$f"; done |
  xargs)
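To complete the fix for (b), the collected symlinks could then simply be recreated in the new version's global directory. A sketch only: it assumes $INSTALLS_SYMLINKS from above and that the new node version is already active (so that npm root -g points at its node_modules); bin shims created by npm link are not handled here:
# Recreate each symlink (pointing at the same local project) under the new version's global node_modules.
# Word-splitting on the space-separated $INSTALLS_SYMLINKS list is intentional.
NEW_GLOBAL_DIR="$(npm root -g)"
for link in $INSTALLS_SYMLINKS; do
  ln -s "$(readlink "$link")" "$NEW_GLOBAL_DIR/$(basename "$link")"
done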
Generally, I wonder whether copy-packages needs something like an --as-is option to simply file-copy the existing global packages - even though that could result in inoperable packages: copied packages could be too old to work with the new engine, and C++ addons could require rebuilding - though the latter could be helped/detected by automatically running npm rebuild after file copying.
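A hypothetical --as-is mode might then boil down to something like this (again only a sketch: it assumes the old version's global node_modules directory is still on disk, that the new version is already active, and that npm rebuild honours -g; bin shims are ignored):
# File-copy all regular global packages from the old version, then rebuild native addons for the new engine.
OLD_GLOBAL_DIR=/path/to/old/version/lib/node_modules   # placeholder: the old version's global node_modules
NEW_GLOBAL_DIR="$(npm root -g)"                        # the currently active (new) version's global node_modules
for pkg in "$OLD_GLOBAL_DIR"/*; do
  [ -L "$pkg" ] && continue                      # symlinked packages are recreated separately (see above)
  [ "$(basename "$pkg")" = "npm" ] && continue   # keep the new version's own bundled npm
  cp -R "$pkg" "$NEW_GLOBAL_DIR/"
done
npm rebuild -g   # recompile C++ addons; if -g isn't honoured, run `npm rebuild` inside each copied package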
@mklement0 I'd actually prefer copy-packages to literally cp symlinks and packages over, and then inspect all package.json files and run their post-install scripts (npm rebuild may help here).
That's a bunch of work, but I'd love to accept some PRs to fix it :-)
@ljharb I see, but wouldn't changing the existing behavior be a concern? If we just copy and run npm rebuild, you'll end up with the same package versions, not the latest ones as is currently the case - though I agree that copy-and-rebuild is (also) useful and perhaps should have been the default all along.
Not at all; that's what semantic version numbers are for.
I'd probably rename copy-packages to reinstall-packages first, and then add a new copy-packages-from option for this new behavior.
The reinstall-packages change has been released in v0.19.0.
Cool, thanks.
One possible approach is advice from @othiym23: "the _from and _resolved fields in the installed package.json will generally give you enough clues to figure out how the package was cached." However, he warns that "that's not always user-friendly information, though. you have to have a fairly good grasp of how the caching algorithm works to interpret what those fields are telling you", so we'd need to tread very carefully.
I'm posting this in case there are others like me who are struggling with this issue and have many linked packages.
The find command below echoes an rm command for every symlinked global package (remove the echo to actually delete them).
# Switch back to the old version
nvm use 0.12.7
# Unlink all linked modules
find "$NVM_BIN/../lib/node_modules" -maxdepth 1 -type l -exec echo rm {} \;
# Run the installer again
nvm install 4.1.0 --reinstall-packages-from=0.12.7
It's not working on v0.26.1.
I thought that was the latest version?
@moander nvm reinstall-packages and nvm install <new> --reinstall-packages-from=<old> have supported linked packages since v0.24.1 (https://github.com/creationix/nvm/releases/tag/v0.24.1), and it works fine on my machine :-)
Would you mind opening up a new GitHub issue? Removing symlinks is a pretty brutal workaround for a likely trivial bug, and I'd rather fix the bug than encourage people to delete their linked modules.
@ljharb I know this is old, but I think nvm install <new> --reinstall-packages-from=<old> still has an issue.
I had installed a local package (called kapsel) globally on my machine:
$ cd /Users/me/SAP/MobileSDK3/KapselSDK/cli
$ npm install -g
...
$ nvm install 5.4.1 --reinstall-packages-from=9.6.1
...
npm ERR! 404 Registry returned 404 for GET on https://registry.npmjs.org/kapsel
Installing this local package globally is required by SAP for the HAT. They don't offer it as a published (online) package; instead, you install it with their setup wizard and link it globally.
This is the package.json of the local package:
{
  "name": "kapsel",
  "description": "A helper for creating Kapsel applications.",
  "version": "3.15.0",
  "homepage": "",
  "author": {
    "name": "SAP",
    "email": ""
  },
  "repository": {
    "type": "git",
    "url": ""
  },
  "licenses": [
    {
      "type": "",
      "url": ""
    }
  ],
  "main": "lib/kapsel_commands",
  "engines": {
    "node": ">=0.9.9"
  },
  "scripts": {
    "test": "grunt nodeunit"
  },
  "dependencies": {
    "archiver": "~1.1.0",
    "progress": "~1.1.8",
    "request": "~2.74.0",
    "q": "~1.4.1"
  },
  "keywords": [],
  "preferGlobal": "true",
  "bin": {
    "kapsel": "bin/kapsel.js"
  }
}