Closed: EvanHahn closed this issue 1 year ago.
I don't think this is a good fit for `np`. I'm not interested in adding such a dependency or maintaining this kind of code, and the time cost for users would be too high (Zopfli is extremely slow). The correct way to do this is for npm to recompress server-side, but I don't think they are willing to do that.
Sounds good. Thanks for your consideration and hard work!
Description
tl;dr: consider re-compressing archives with Zopfli to shrink package file sizes by ~5%.
npm packages are distributed as compressed archives; specifically, gzipped tarballs. Improvements to the compression of these archives would lessen bandwidth and storage used.
`npm publish` (and `npm pack`) compress packages with zlib. Switching to a better compression algorithm, like zstd, would improve compression, but nobody could use the resulting packages because zstd is not compatible with gzip.

Zopfli is a library that can compress data in a gzip-compatible format. In other words, data gzipped with Zopfli can be decompressed "as normal". It usually improves compression over zlib but takes longer on the compression side. It is just as fast on the decompression side, so only package authors would have to wait.
I re-compressed the latest version of several popular packages with Zopfli. Here are the size savings:
I do this for one of my packages and it works well. You can also try this browser-based proof-of-concept for other packages.
`np` could compress packages with Zopfli.

Advantages:
Disadvantages:
`tar` was able to completely compress the archive in about 1.2 seconds on my machine; Zopfli, with just 1 iteration, took 2.5 minutes.

Possible implementation
Instead of effectively running `npm publish`, np would effectively run:

Alternatives
This could be implemented in `npm` itself. I proposed this to npm a year ago but it was (rightly, I think) rejected.