Closed connorjclark closed 4 years ago
I expected this kind of issue. Getting a 100% precise per-module gzip size seems almost impossible: a module is just part of a file, and gzip applies to the whole file. I still wonder what the use is of knowing the gzip size of a module when it's bundled with other modules.
The purpose of merging ranges is to collect the biggest string that is part of a module. I haven't tested it enough, but I think in most cases it should include the whole module's code.
You can see the actual bundle gzip size in the `totalBytes` property.
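To make the "gzip applies to the whole file" point concrete, here is a minimal sketch (using Node's built-in `zlib`; the bundle contents are a stand-in string, not real output from the tool) of how a `totalBytes`-style whole-bundle figure is just gzip over the full file:

```typescript
import { gzipSync } from 'zlib';

// Stand-in for a bundle's contents; in practice this would be the emitted file.
const bundle = 'console.log("hello");\n'.repeat(100);

const rawBytes = Buffer.byteLength(bundle);
const gzippedBytes = gzipSync(bundle).length;

console.log('raw bytes:', rawBytes);
console.log('gzipped bytes:', gzippedBytes);
```

The gzipped figure is a property of the file as a whole; there is no exact way to attribute slices of it back to individual modules.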
> I still wonder what use of knowing the gzip size of a module when it's bundled with other modules.
Our specific use case is that we want to minimize the download time for a bundle. Because of gzip, a large JSON module probably compresses to fewer bytes than an equivalently sized JS module. It's nice to get a rough sense (understanding that exact numbers are impossible) so that we know where to invest engineering effort.
> Merging range purpose is to collect the biggest string that is part of a module. I haven't tested it enough but I think in most cases it should include the whole module code.
If this is true, then I think this tool is close enough. I see a comment saying it combines contiguous mappings of the same source, but I think mappings tend to be contiguous for an entire module, so that sounds good to me.
impl: https://github.com/danvk/source-map-explorer/blob/f408a40a46a07f01b7b10c7a49ca19133bfe2c40/src/explore.ts#L265-L280
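For readers skimming the thread, range merging along these lines can be sketched as follows. This is a hypothetical simplification, not the linked implementation: it assumes each mapping is a half-open `[start, end)` byte range in the generated bundle, and merges ranges that touch or overlap:

```typescript
// Hypothetical sketch of merging contiguous mappings for one source module.
interface Range {
  start: number; // inclusive byte offset in the generated bundle
  end: number;   // exclusive byte offset
}

function mergeRanges(ranges: Range[]): Range[] {
  const sorted = [...ranges].sort((a, b) => a.start - b.start);
  const merged: Range[] = [];
  for (const r of sorted) {
    const last = merged[merged.length - 1];
    if (last && r.start <= last.end) {
      // Contiguous or overlapping with the previous range: extend it.
      last.end = Math.max(last.end, r.end);
    } else {
      merged.push({ ...r });
    }
  }
  return merged;
}

console.log(mergeRanges([
  { start: 0, end: 10 },
  { start: 10, end: 25 },
  { start: 40, end: 50 },
]));
```

If a module's mappings really are contiguous, this collapses them into one range covering the whole module, which is what makes the per-module size estimate plausible.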
My concern is that calculating the gzip size of each mapping in isolation misses out on compression savings from common patterns. I expect this means that `--gzip` overestimates the gzip size of each module. Though I'm not sure what `mergeRanges` does; perhaps it is grouping everything?

(Came up in https://github.com/GoogleChrome/lighthouse/issues/10476)
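The overestimation concern is easy to demonstrate. In this sketch (invented module contents, Node's built-in `zlib`), two modules share a lot of structure, so gzipping them separately and summing the results costs more than gzipping the concatenated bundle once, because the shared patterns only get back-referenced in the combined pass:

```typescript
import { gzipSync } from 'zlib';

// Two invented modules with heavily shared patterns, as bundled code often has.
const moduleA = 'function greet(name) { return "hello " + name; }\n'.repeat(20);
const moduleB = 'function greet(name) { return "howdy " + name; }\n'.repeat(20);

// Per-module gzip sizes, computed in isolation and summed.
const separate = gzipSync(moduleA).length + gzipSync(moduleB).length;

// Gzip size of the whole concatenated bundle.
const combined = gzipSync(moduleA + moduleB).length;

console.log({ separate, combined });
```

Here `separate` exceeds `combined`, which is exactly why summing isolated per-mapping gzip sizes overstates each module's share of the real compressed bundle.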