I see uses of force publish, but the security concerns far outweigh convenience. So :+1: from me
We've recently had to re-publish the jQuery 1.11.0 & 2.1.0 packages since `npm publish` did the publish but didn't properly register the tarball, so doing `npm install jquery` was actually breaking.
I guess we could publish to `1.11.0+1` or similar in such cases, though that would create confusion for jQuery users; they would need to understand why there is no `1.11.0` package on npm (whereas there is such a package on Bower & when distributed as a source file) but only a `1.11.0+1`.
Definitely +1.
I also sometimes need to republish a package. In any case, I do it within a couple of minutes of publishing. So it would be great if, for a period of about half an hour (or 24 hours), it were possible to republish a package. If no one republishes the package in time, the package is frozen.
@coderaiser I used to have the same requirement until I found that I could do `npm install /my/local/module/path`. I had some test code which required my module to be published to the npm registry. If anything went wrong, I had to modify it and publish it again. Now I can do a local npm install for such tests, so I can be quite confident when I do the real `npm publish`. I'm not sure whether your case is the same as mine, but I hope this could be helpful.
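For example, a rough sketch of that workflow (the paths are hypothetical):

```sh
# Install the work-in-progress module straight from disk instead of the registry.
cd /path/to/consumer-project
npm install /my/local/module/path

# Run the consumer's tests against the locally installed copy.
npm test

# Only when everything passes, publish for real from the module's directory.
cd /my/local/module/path
npm publish
```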
I guess we could publish to 1.11.0+1 or similar in such cases though that would create a confusion for jQuery users
You can't. AFAIR, npm cuts out build metadata before publishing, so it'll end up as the same old `1.11.0`, which will fail.
@elgs Well, it's a good point; I didn't know about this feature. Anyway, without force unpublish, life would be much more stable.
+1 on a grace period that allows force publishing for something like an hour to fix stupid mistakes and random errors.
Since npm and GitHub have slightly different Markdown parsers, it is a bit much to have to bump a version just to fix some Markdown.
Also very handy when publishing had a hiccup (like not picking up the README.md, or other random glitches).
This would also be safer than allowing a certain number of republishes, as the security risk is limited to the short grace period.
In our specific jQuery example publishing was successful and we didn't notice the problem until some time later when we got a bug report about the package. Bumping the version from 1.11.0 to 1.11.1 in such a situation is not really a solution - we're primarily a client-side library so we wouldn't release a new version just because publishing on npm failed. It would be confusing for users to understand that 1.11.1 on npm is the same as 1.11.0 outside; we'd also need to skip 1.11.1 on the next patch release. This is a lot of problems.
If you remove the `-f` option, I hope there will at least be a process we could go through in case of such (maybe rare but still happening) problems to be able to republish.
I came here to hate everything … then I read the whole thread and changed my mind.
Package versions treated as if they were the MD5 or SHA1 of a binary :+1:
I still believe what @mzgol asked is more than reasonable: a process for major libraries, or even some npm gotcha/problem, that goes through "manual" updates, or is supervised and rare enough to allow the force … maybe keeping the `-f` flag with a `--because=reasons` option and a sort of quick human admin yep/nope from npm would be great?
I understand you guys don't have a help desk though, so … "no way we're gonna do that" would be an acceptable answer.
my 0.02
Thanks @isaacs for this change. I have had situations before where my apps just stopped working, mainly because of a silent `publish -f` someone did on a module, so I am happy that this will not happen again :+1:
Me: Okies, let me do a `npm publish -f` … and @isaacs:
++many. Been broken by a republish and it was extremely annoying.
I'm all for removing `npm publish -f`. We need the package manager to be reliable more than anything else. Having versions be immutable is a major contributing factor to reliability.
I do understand that sometimes a deployment might go wrong. What if there were a grace period (1h?) during which `-f` was allowed?
Alternatively, keep `-f` but somehow verify the shasum automatically, so that if the dependency changed, the `npm install` would fail. (`lockdown` does this, but you must opt in.)
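For what it's worth, you can already compare the registry's recorded shasum against a freshly packed tarball by hand; a rough sketch (the package name and version are placeholders):

```sh
# Ask the registry which sha1 it has on record for a published version.
npm view some-package@1.2.3 dist.shasum

# Fetch and pack that same version locally, then hash the tarball (shasum defaults to sha1).
npm pack some-package@1.2.3
shasum some-package-1.2.3.tgz

# If the two hashes differ, the published bits are not what you expect.
```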
Strikes me many of the concerns driving use of `npm delete` could be handled with a hypothetical `npm deprecate`, e.g. persuading upgrades for security reasons.
`npm install` would loudly complain about deprecated modules, but still proceed. The registry would also display warnings about a module's direct or indirect dependency on a deprecated version. (Does it do so on deleted versions?)
Would that take enough pressure off `npm delete` to ease some of the `--force` "workaround" issues raised above?
Alternatively keep -f but somehow verify shasum automatically so that if the dependency changed the npm install failed
That's shrinkwrap's job. Sadly, it doesn't verify shasums, but I hope it will be changed eventually.
Sadly, it doesn't verify shasums, but I hope it will be changed eventually.
Yes, it is planned. Requires some extensive cleanup that hasn't risen to the level of getting done yet, sadly.
This is now done and pushed live. And the latest npm release doesn't even allow you to try to overwrite a version, even with `--force`.
@isaacs so the short one-hour (or whatever) grace-period didn't make the cut?
Is it out of undesirable complexity, or maybe just a bad idea? (I ask out of interest in the reasoning)
@Bartvds Please read the thread - https://github.com/npm/npmjs.org/issues/148#issuecomment-34938582
A time-based approach is problematic because it relies on PUTs being near-in-time to a user typing `npm publish`. Since we routinely replicate these docs all over the place, that is not a reasonable solution.
Also, since CouchDB does not treat time as a first-class thing, this could potentially open security holes.
Unpublishes are rare anyway. If someone truly needs this, then they can reach out for admin help. If we start to see a lot of support requests for this, we can perhaps work something out that is more automated.
Yeah, the concept of time is in direct conflict with the offline conflict-resolution schemes that CouchDB attempts to support.
-1 to unpublish. If something goes wrong, simply increase the minor version number by 1 and publish again. There is a huge (actually infinite) integer space for version numbers.
Thanks for the clarifications, makes sense (TIL :)
@elgs If only disk space were as unlimited as integer space :)
Unpublished releases still seem to show up with `npm info`. Is that desired behavior?
Primarily for archival purposes: there have been cases when package authors unpublished their packages without considering the impact such an action would have on everyone depending on them.
For example, `grunt-contrib-jasmine-node` was unpublished and republished as `grunt-jasmine-node` without any warning. This caused lots of broken builds around the world.
It's quite hard to prevent people from doing stupid stuff without disabling features that are occasionally needed. Maybe displaying a warning and asking the maintainer to confirm the unpublishing by typing the name of the package would raise awareness about the possible dangers of unpublishing.
I know it's difficult wrt Couch, but honestly download counts should come into play here, imo. If something's never been downloaded, deleting it shouldn't be restricted: I should be able to erase a package version from existence that nobody's used.
You can do exactly that, unpublish still works.
You just can't later re-use that number to mean something else.
@glenjamin I'm arguing that unpublishing can screw many people if not communicated well and done carefully for packages that people depend on. Because of this, a maintainer unpublishing a package should go through extra hoops and be warned of the consequences.
Oh boo. This change just bit me. I almost only ever use unpublish / publish to a) fix docs, b) add a tag to a pre-release. (We really could use a better way of doing pre-releases...).
So I just released a project generator in a beta version, but messed up the tag. ...Because it's `--tag foo` and not `--tag=foo` as I expected. Sigh. I immediately unpublished and re-published. No, I didn't know about `npm tag`...
You know what happens now. npm refuses to let me re-publish... the exact same package. Which is unfortunate, because the version numbers of this package and the framework it belongs to are supposed to be in sync! (...Unless there is a genuine blocker that needs fixing.)
So yeah, something like `npm restore` would be incredibly useful right now, if only for the specific use case of people like me who didn't know about this change.
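For anyone hitting the same thing, a rough sketch of the tag workflow that avoids unpublishing altogether (the package and tag names are made up):

```sh
# Publish a pre-release under a non-default dist-tag, so "npm install <pkg>"
# keeps serving the latest stable version.
npm publish --tag beta

# Or attach/move a tag on a version that is already published,
# instead of unpublishing and publishing again.
npm tag my-generator@1.0.0-beta.1 beta

# Consumers then opt in explicitly:
npm install my-generator@beta
```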
Should the advice in the FAQ on checking in node_modules for deployables be updated? I.e. Assuming a decent continuous integration/deployment pipeline, can we now consider shrinkwrap a sensible option for deployables too?
I consider shrinkwrap workable at this point, though once checksums are added, that'll be all the better.
@raoulmillais There are still advantages to checking in deps for deployed artifacts, but really, that's a very subtle set of tradeoffs that you must make for yourself. Many folks just trust version numbers as-is, or use shrinkwrap, or do a pre-deploy step that saves and verifies things, etc.
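As a rough sketch of the shrinkwrap option for deployables (the commit message is only an example):

```sh
# Freeze the entire installed dependency tree at its exact versions.
npm shrinkwrap

# Check the lockfile in alongside package.json; deploy machines then get
# exactly the recorded tree from a plain "npm install".
git add npm-shrinkwrap.json
git commit -m "Lock dependency tree for deploys"
```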
@MarcDiethelm `--tag=foo` should absolutely work fine. What version of npm are you using?
You know what happens now. npm refuses to let me re-publish... the exact same package.
It's not exactly the same, though. It has a different checksum, different modified date, different published date. You may say it's the same, and I believe you, but leaving that door open is dangerous.
What package? What version? I'll unblock it for you.
This is a very unfortunate change that I hope gets reverted. If I want to change any metadata or readme information, I shouldn't have to version bump. If I change a paragraph in my readme, I wouldn't bump the version in git, but now I have to just to get npm to publish correctly.
@mike-feldmeier Working on an `npm doc-publish` command (or some such) that will JUST update the documentation.
Does `npm doc-publish` mutate the README in the `package.json`, and thus on the npmjs.org website?
@Raynos It'd update the root doc on the registry, and on the website, without doing another publish or requiring a version bump.
I've just gotten caught out by this. I understand the reasons behind this (and agree with it completely), but it seems strange that you can unpublish a package (thus breaking other people's stuff) and then not put it back.
How can you get around that?
@riggerthegeek Don't unpublish unless it is critically important to remove something from the registry. Publish the update with a bumped patch version, so they're likely to be able to pull it if they are not pinning their versions explicitly. If they ARE pinning their versions explicitly, then they'll probably be better off being broken (and noticing!) rather than silently getting changed bits.
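To make the pinned-vs-range distinction concrete, a minimal sketch (the dependency name is hypothetical; `--save-exact` requires a reasonably recent npm):

```sh
# Default: npm records a caret range (e.g. "^1.4.2") in package.json,
# so the next patch release is picked up automatically on a fresh install.
npm install my-dep --save

# Exact pin: npm records "1.4.2"; if those exact bits ever disappear,
# installs fail loudly instead of silently pulling something different.
npm install my-dep@1.4.2 --save --save-exact
```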
@isaacs "npm cache clean foo". it works! thank you
I have to say, this is quite unfortunate that I can't republish an existing version.
Here's my story.
A user has reported that he can't install my module (jsonix):
https://github.com/highsource/jsonix/issues/6
After a short investigation it appeared that the tarball for the 2.0.1 version was missing:
https://registry.npmjs.org/jsonix/-/jsonix-2.0.1.tgz
I'm no npm pro, so I googled and came upon the advice to "unpublish then republish an existing version". So I did unpublish, and then could not publish it again.
Well, great. But when I tried to publish a new version (2.0.2), I started getting 502 errors on publish:
https://github.com/npm/npm/issues/4878
The thing is that I get a new INVALID version in the registry even when publishing fails, but I can't republish existing invalid versions. I am forced to bump the version. It's already the 7th or 8th version I've had to bump in my attempts to publish a version of my package.
Is there any way to allow these invalid versions to be republished? I actually just want to publish 2.0.1. I don't need all these versions 2.0.2 and above; they appeared only due to the failed publish attempts.
@highsource In your case `publish -f` is not a solution but a workaround for a different issue that should be solved. It should not be possible to successfully publish a package and not have it available for download. At the same time, if the publish is not successful due to some internal issue (symptom: an HTTP 5xx response), the version number should remain available for republishing.
@highsource the situation you're describing is totally valid and 100% understandable. It's for that exact reason (i.e. a missing tarball for a specific version) that we encourage you to contact us directly at support@npmjs.com. It's our experience that only the tiniest percentage of people run into this situation, and thus we make special exceptions for them.
As for the 502 errors - please upgrade your node and npm instances (they're rather behind, unfortunately). If you need your node and npm versions to remain at their current versions, let us know in the support email you send us.
Send us an email and we'll fix it all up :-D
@IgorMinar You're totally right. I've reported the other issue here: https://github.com/npm/npm/issues/4880 Already closed since the problem does not persist anymore with 1.4.3. @rockbot Thank you for the response! I could successfully publish with npm 1.4.3 and the latest node.js. I'll send a support mail in a few moments. Thank you people for your kind help. :)
@isaacs Not sure if it's related to this fix, but I just found I can no longer republish an entire module after I had just unpublished it. `npm unpublish` is giving me `Error: forbidden already unpublished` while `npm publish` is throwing `Error: forbidden Cannot replace previously published version: 0.0.1`. Any ideas on a workaround?
@blakeembrey You need to bump the version; did you do that? After this change it has to fail otherwise; anything else wouldn't make sense.
@mzgol I can bump the version, but it seems counterintuitive since the module is completely gone.
@blakeembrey It's explained here: http://blog.npmjs.org/post/77758351673/no-more-npm-publish-f
@isaacs Awesome, thanks. Makes sense; just following the errors didn't, and it seemed like it could be a cache issue or something. Maybe an update to the error messaging could help (as well as a better warning/link when unpublishing).
I had only published for a split second before unpublishing it because of a tiny typo. This was 0.0.1 so I figured it hardly mattered since I could just reset git and republish the module fresh with the correction.
In b054e4866c1121a5157b7306ac07fe6d753b1ddc and 516262d4f39949ab1f3d55b609ad3bb30c0c36b4, the ability to publish over a previously published version is removed. This hasn't been published live yet, but it has been tested extensively, and I think it's a Good Thing.
With this change, if you publish `foo@1.2.3`, you can still un-publish `foo@1.2.3`. But then, you will not be able to publish something else to that same package identifier. Not now, not ever. Deleting documents entirely will be forbidden (though of course old revs still get compacted out), and we'll have a lot more visibility into what each change in the db does.
I wouldn't characterize it as a security fix, per se, but it does reduce a significant gotcha that users have run into over the years, which is especially annoying when deploying Node servers to production. If the bits are gone, ok, it's an error; you notice and deal with it. But if the bits are changed, then this can be really hazardous.
I'm about 85% sure that this is the right approach. It enforces a best-practice that is pretty much universally agreed upon. But, since it is reducing functionality in a way, I think it'd be good for people to speak up if they have a use case that'll be impacted.
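In practice the resulting behaviour looks roughly like this, using the `foo@1.2.3` example above (error text paraphrased from reports earlier in the thread):

```sh
npm publish                 # publishes foo@1.2.3
npm unpublish foo@1.2.3     # still allowed: the tarball is removed

npm publish                 # rejected: cannot replace previously published version 1.2.3

# The only way forward is a new version number.
npm version patch           # bumps package.json to 1.2.4 (and commits/tags if in a git repo)
npm publish                 # publishes foo@1.2.4
```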