shssoichiro / oxipng

Multithreaded PNG optimizer written in Rust
MIT License

Expose zc level 0 (uncompressed) πŸš€ #641

Closed Galaxy4594 closed 2 months ago

Galaxy4594 commented 2 months ago

Allow the creation of PNGs with uncompressed deflate streams via level 0 of libdeflate. If you want a glorified BMP with delta filters, this change will make your dreams come true πŸ™‚.

goodusername123 commented 2 months ago

Thanks, this seems harmless enough πŸ™‚ Curious why you would use this though?

The possible use cases are niche but do exist, such as storing PNGs with compression methods other than deflate in a way that doesn't break compatibility with pre-existing decoders. An example: convert existing PNGs to uncompressed streams while keeping the delta filter strategies present in oxipng, then compress those PNGs inside a container or filesystem that supports compression methods with better ratios than deflate (such as brotli).
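For anyone wondering what level 0 actually emits: a deflate stream at level 0 consists of "stored" blocks, so any existing decoder can still read it, but the payload sits in the stream verbatim. A minimal sketch using Python's zlib as a stand-in for libdeflate (both emit stored blocks at level 0; the sample data is synthetic):

```python
import zlib

# Redundant sample data standing in for filtered PNG scanlines.
raw = bytes(range(256)) * 64

# Level 0 emits "stored" (uncompressed) deflate blocks: the payload is
# byte-for-byte inside the stream, plus a few bytes of block headers.
stored = zlib.compress(raw, 0)
assert len(stored) > len(raw)          # slight overhead, no compression
assert zlib.decompress(stored) == raw  # still a valid zlib/deflate stream

# A normal compression level, for contrast.
deflated = zlib.compress(raw, 9)
assert len(deflated) < len(raw)
```

The few bytes of overhead come from the stored-block headers and the zlib checksum; an external compressor sees essentially the raw bytes.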

andrews05 commented 2 months ago

Hm, just be aware that if the compression level is capped at 0, the filter evaluations (and some reductions) will all end up tied for which is best. E.g. if you run it with all filters -f 0-9 it will likely just end up choosing the first result every time, i.e. "None". You could pick just one, e.g. Brute, but that won't always be optimal for the external compressor.
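To illustrate why the evaluations all tie: PNG filters transform the bytes of a scanline but never change its length, so with no compression every filter produces the same file size and a size-based evaluation can't distinguish them. A toy sketch of the Sub filter, written from the PNG spec (not oxipng's code):

```python
def sub_filter(scanline: bytes, bpp: int) -> bytes:
    """PNG Sub filter: each byte minus the byte one pixel to its left."""
    out = bytearray(scanline)
    for i in range(bpp, len(scanline)):
        out[i] = (scanline[i] - scanline[i - bpp]) & 0xFF
    return bytes(out)

line = bytes([10, 20, 30, 40, 50, 60])  # one toy RGB scanline (2 pixels)
filtered = sub_filter(line, bpp=3)
assert len(filtered) == len(line)  # same length, so same uncompressed size
```

Filtering only pays off once a compressor exploits the more regular byte distribution, which is exactly what a downstream brotli/zstd pass would do.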

Galaxy4594 commented 2 months ago

Instead of trying every possible filter, I use a simple heuristic: I pick the filter that performs best under deflate. I run oxipng -o max -P -vv input.png and select the filter that results in the smallest file size. Then, I run oxipng --zc 0 -f x --force input.png --out output.png && brotli -j -v output.png. With some simple scripting, I can generate PNGs compressed with Brotli/Zstd that are roughly 10-50% smaller. 😁
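A rough stdlib illustration of why this pipeline can win (lzma standing in for brotli/zstd, synthetic data instead of real image bytes): redundancy that falls outside deflate's 32 KiB window is invisible once the data has been deflated, but a larger-dictionary compressor can still find it in the raw bytes.

```python
import hashlib
import lzma
import zlib

# Deterministic pseudo-random 40 KiB block, repeated 4 times. The repeats
# are farther apart than deflate's 32 KiB window, so deflate cannot exploit
# them, but LZMA (with its much larger dictionary) can.
block = b"".join(hashlib.sha256(i.to_bytes(4, "big")).digest()
                 for i in range(1280))  # 1280 * 32 B = 40 KiB
raw = block * 4

# Pre-deflated data looks incompressible to the stronger compressor...
deflated_then_lzma = lzma.compress(zlib.compress(raw, 9))
# ...while the raw bytes let it find the long-range redundancy itself.
raw_then_lzma = lzma.compress(raw)

assert len(raw_then_lzma) < len(deflated_then_lzma)
```

Real images won't be this extreme, but the same effect applies across similar scanlines and (especially) across multiple similar files in one archive.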

andrews05 commented 2 months ago

Yeah, that could work. You could also drop the -P on the first run and then use --nx on the second, so it retains the best reductions from the first.

[edit] How does it compare to just using a newer format like JXL or (shudder) WebP?

Galaxy4594 commented 2 months ago

How does it compare to just using a newer format like JXL or (shudder) WebP?

Not good: JXL and WebP compress about 10-20% more efficiently than Brotli/Zstd PNGs. For 16-bit content, JXL achieves around 50% better compression. For animated images, the improvement varies by 30-50%. Maybe compressing multiple images in a .tar would achieve better results? It probably won't make much of a difference.

While talking to a friend, we came up with a crazy idea to save bandwidth by any means necessary: serve uncompressed PNGs, which are then compressed via a CDN/web server and decompressed in the browser (assuming you can’t use WebP/JXL). πŸ˜…
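The transport-layer idea can be sketched with stdlib gzip standing in for brotli (the "uncompressed PNG" payload here is synthetic, not a real file):

```python
import gzip
import zlib

# Stand-in for an uncompressed PNG: the IDAT-equivalent payload holds
# stored deflate blocks, so the redundancy of the filtered image data
# is still visible to whatever compresses the response.
image_data = bytes(i % 7 for i in range(50_000))
uncompressed_png_payload = zlib.compress(image_data, 0)

# What a CDN/web server would do per response (Content-Encoding),
# with gzip as a stdlib stand-in for brotli.
wire = gzip.compress(uncompressed_png_payload)
assert gzip.decompress(wire) == uncompressed_png_payload
assert len(wire) < len(uncompressed_png_payload)
```

The browser's transport layer undoes the encoding automatically, so the decoder still sees a valid (stored-block) PNG.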

ace-dent commented 2 months ago

@Galaxy4594 - I was experimenting with this a few years back. The theory is sound: (1) you have a container that has a better compression algorithm; and/or (2) you can exploit duplicated information across multiple images for higher compression. As you know, on the web this can happen at the transport layer using gzip and now brotli, depending on server setup. For speed this may be disabled for PNG assets server-side, but I can imagine embedded (base64) images in an HTML page may work out ...?

My interest is more in packaged PNGs, often in zip archives, within a software distribution. Recently I analysed a few hundred PNGs that are LZMA-packed for the KolibriOS project. Contrary to theory (and as I've seen before too), ~99% of images do benefit from a high level of PNG (deflate) compression and then being compressed in an archive (even deflating a pre-deflated stream!).

@andrews05 - I think the expected behaviour here is good filter choices but an uncompressed final output. Would it make sense to do the evaluations with some level of deflate compression? Or is it undesirable/unintuitive to override the user's choices in this way? Either way, it's pretty niche, and perhaps should be documented as such?

Galaxy4594 commented 2 months ago

Recently I analysed a few hundred PNGs that are LZMA-packed for the KolibriOS project. Contrary to theory (and as I've seen before too), ~99% of images do benefit from a high level of PNG (deflate) compression and then being compressed in an archive (even deflating a pre-deflated stream!).

I did my own tests and came to the opposite conclusion: uncompressed PNGs compressed better than regular deflate PNGs. I even tested on the data set you mentioned (KolibriOS), and these are the results I got. These were compressed at 7z level 9 with word size 273.

KolibriOS-uncompress-PNGs.7z ------- 44,165,338 bytes
KolibriOS-deflate-PNGs.7z  --------- 44,074,109 bytes
KolibriOS-uncompress-brute-PNGs.7z - 43,950,764 bytes

It may seem like recompressing deflate PNGs is better, but applying the right filter makes the difference. No manual per-image filter selection; I just applied the Brute filter to every PNG, 😁.

ace-dent commented 2 months ago

@Galaxy4594 - thanks for sharing these interesting results. I should have been a bit more specific... KolibriOS uses its own packer (originally derived from 7z, but with some differences). I will certainly investigate again in the future πŸ‘