Closed dstufft closed 8 years ago
Hey, @dstufft! Would the `threshold` option work for you? It lets you specify a minimum file size required for compression.
@jstuckey So it's not really a function of file size, but rather the makeup of the original file. Here's an example from one of my OSS projects:
-rw-r--r-- 1 dstufft staff 982 May 2 11:31 warehouse/static/dist/images/blue-cube-small.288811c8.png
-rw-r--r-- 1 dstufft staff 1005 May 2 11:31 warehouse/static/dist/images/blue-cube-small.288811c8.png.gz
As you can see, applying gzip to this file actually made it larger (only by 23 bytes, but those 23 bytes can add up for oft-requested files). However, I have smaller files that gzip compresses well, reducing their size by half or more.
I think it would be useful to have a dedicated option for this, ideally with a configurable "savings threshold". It could accept a boolean (`true`?) that performs a basic check like `if (compressed_file_size >= original_file_size) { skipFile(); }`, or an integer from 1 to 100 representing the percentage of the original file size that the reduction must clear before it's considered worth keeping two representations of the file (which would need to be cached twice instead of once). Projects could then decide whether, say, a 1-byte savings is worth caching something twice, or whether they only want to compress files whose size can be reduced by at least 10%.
Does all of that make sense?
Yeah, that makes sense. I'll take a look.
Merged #21 and published as version 1.4.0. https://www.npmjs.com/package/gulp-gzip
Thanks!
No problem!
Compressing a file typically makes it smaller, but sometimes it does nothing or even makes it larger. It would be great if this library gained an option to only write out files where compression actually reduced the size. This would eliminate the chance that compression makes things worse (either through a larger file size or through two cache entries instead of one).