chrisbateman / webpack-visualizer

Visualize your Webpack bundle
http://chrisbateman.github.io/webpack-visualizer/
MIT License

gzip sizes #23

Open matthiasg opened 7 years ago

matthiasg commented 7 years ago

This is a nice plugin, but shouldn't it be possible to take the actual file contents (not just the sizes) and run gzip on the fly (e.g. using zlib) to calculate the correct gzipped size, or at least a better approximation? Does the code use the file contents or just the sizes?

zlib.gzip(code, (err, zipped) => { /* zipped.length */ })

chrisbateman commented 7 years ago

So there are some challenges here. First off, I'm trying to keep this as consistent as I can between the plugin and the web version. In the web version, all we have to go on is the stats.json.

It does not include the actual source of the output bundles - which means that we don't have anything to actually run gzip on. And even if we did - the best we could do is gzip the entire bundle and then use the post-loader size percentages to estimate a gzip size for each module. This is what I'm doing for the minified sizes already.

It might be possible to work around some of these issues in the plugin version alone, since it has access to the actual bundle source. Some other tools like webpack-bundle-analyzer do clever things like parsing the bundle source and figuring out where each module is located in it, which yields a very accurate minified size. However, running gzip on those modules one at a time yields an overestimate, since the bundle as a whole gzips much more efficiently. I'm thinking, though, that multiplying each module's percentage by the total gzipped size would probably be close enough.

So that might be an option for the plugin version...

matthiasg commented 7 years ago

@chrisbateman wow, great to see such a response. I wasn't aware that there's a client-side version of this (I forgot about the hosted version with the upload ability), which obviously doesn't see the source anymore. But the work you did, and so quickly, is spot on. An estimate of gzip per file is pretty good.

Of course the entire bundle might pack more efficiently, but the reason I asked for source-level compression is to detect the impact of included code that is harder to compress (e.g. embedded base64-encoded JPEGs or similar, which usually don't compress as well as the rest). Of course these cases are rare; I just happened to have something like this.

matthiasg commented 7 years ago

Sorry, I didn't mean to close it, in case you might want to track this.

valscion commented 7 years ago

Hiya! Do you think we could combine our efforts for this idea and webpack-bundle-analyzer somehow? I feel like a unified solution to get the stats out from webpack builds would benefit us both, and the reporters could diverge freely if we get to separate the plugin code and the reporter from each other.

I'm looking at solving https://github.com/th0r/webpack-bundle-analyzer/issues/32 right now, and I'd like to collaborate with other npm packages trying to solve the same issue.

chrisbateman commented 7 years ago

@valscion The main complication here is that this is both a plugin and a website - and the website only gets the stats.json data.

What if we had a module that was solely responsible for stats extraction, and it could accept either stats.json alone, or that plus the other info you can get from a plugin?

valscion commented 7 years ago

It seems that webpack-bundle-analyzer needs access to the filesystem to get the other sizes besides the parsed one. The package ships a CLI that can be fed a stats.json file, but unless it is also given the bundle directory location, it can't report gzipped and minified sizes.

I'm not sure how everything works under the hood, but I'll see if we could get by with just the stats.json, if it contained the sources too.