Closed imranbarbhuiya closed 1 year ago
Hi @imranbarbhuiya, thanks for sharing the idea here. If we could dynamically load the compression libraries, that would definitely be much better for browser load performance. Unfortunately, because the compression libraries are all sync, there's no way to use a dynamic `import()` to defer that load; it would only work on the async paths, and only if we could disable the sync compression code. For example, if we could split the library into separate `zlib/promises` and `zlib` entry points, then `zlib/promises` could defer loading of the individual compression libraries using this technique. Unfortunately, that is not how Node.js works!
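For illustration, the deferred-loading technique described above could look something like this. This is only a sketch: the `zlib/promises` split it assumes does not exist in Node.js, and Node's built-in `zlib` stands in here for the heavy compression dependency.

```javascript
// Hypothetical async-only entry point (a zlib/promises-style split
// that Node.js does not actually provide). Because the heavy codec
// module is only imported inside the async function, a bundler can
// emit it as a separate lazily-loaded chunk instead of shipping it
// in the initial bundle.
async function gzip(data) {
  // import() runs on first call, not at bundle startup.
  const { gzipSync } = await import('node:zlib');
  return gzipSync(data);
}
```

A sync `gzip()` export could not do this, which is exactly why the sync API blocks the optimization.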
Short of that, the best option would be to use an env var and optimize out code you know won't be needed. If that would help, I'd be happy to add one, but it wouldn't be a feature JSPM itself would take advantage of.
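As a sketch of how such an env var could be consumed (the `JSPM_NO_INFLATE` name is hypothetical; no such flag exists today), a build script could compile the guarded branches away with esbuild's `define` option:

```javascript
// Build-config sketch, assuming the polyfill guarded its inflate
// paths with a check like: if (!process.env.JSPM_NO_INFLATE) { ... }
// The JSPM_NO_INFLATE name is invented for illustration.
import * as esbuild from 'esbuild';

await esbuild.build({
  entryPoints: ['src/app.js'],
  bundle: true,
  outfile: 'dist/app.js',
  // Replaces the expression with the literal "true" at build time;
  // esbuild's dead-code elimination then drops the guarded branches.
  define: { 'process.env.JSPM_NO_INFLATE': '"true"' },
});
```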
I don't think an env var would be a good solution for all these packages. Maybe esbuild should fix the tree-shaking issues instead. And I don't think modifying the jspm public API for only one package is worth it, so I'll see if there's another way to do it without changing the jspm API for my use case only. Thanks ❤️
Hi, thanks for maintaining this excellent lib. I have an esbuild plugin that uses this lib. In https://github.com/GoogleChrome/lighthouse/pull/15405, it was used to replace the existing unmaintained plugin. But since `zlib` is a huge package and they are using only the `gzip` function, they want to exclude the `pako/lib/zlib/inflate.js` file. Since esbuild doesn't remove unused exports, this is done by replacing the content of that file. And since `jspm` bundles everything into one file, we can't apply the same hack with my plugin. Is it possible to split long polyfills into multiple files?

Alternative: add all the deps as main dependencies instead of dev deps and add `src-browser` exports in `package.json` so we can use them directly.
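For context, the file-replacement hack mentioned above can be sketched as an esbuild plugin along these lines (an illustrative sketch only; the filter path is the pako inflate module named earlier, and the stub contents are a placeholder):

```javascript
// Sketch of the hack: an esbuild plugin whose onLoad hook swaps the
// pako inflate module for an empty stub, so its real implementation
// never reaches the bundle.
const stubPakoInflate = {
  name: 'stub-pako-inflate',
  setup(build) {
    // Matches .../pako/lib/zlib/inflate.js on both / and \ separators.
    build.onLoad({ filter: /pako[\\/]lib[\\/]zlib[\\/]inflate\.js$/ }, () => ({
      contents: 'module.exports = {};', // illustrative empty stub
      loader: 'js',
    }));
  },
};
```

Because jspm bundles the whole polyfill into a single file, there is no separate `inflate.js` path for a filter like this to match, which is what motivates the request to split the polyfills.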