mozilla / mozjpeg

Improved JPEG encoder.

Add the capabilities of imgmin for optimizing images. #12

Open 133794m3r opened 10 years ago

133794m3r commented 10 years ago

Imgmin is a lossy optimizer, but it works by making the image appear no worse, so it's a really good system. Also, I doubt that just optimizing the Huffman trees will do a whole lot of extra work.

https://github.com/rflynn/imgmin

That is the link to his project, which attempts to optimize JPEGs. If you haven't already, I'd look at 7-Zip's Deflate encoder and see whether its Huffman encoder is better than what most everyone else does. I know that its output can be beaten ever-so-slightly in full Deflate streams when optimizing the Huffman trees, but since this project is trying to be the single best system out there, I'd like you to look at that. I'd also like you to look at imgmin for encoding JPEGs at a lossy level (not by default, but as an option), so that files come out much smaller without losing any visual quality.

P.S. I imagine this is going to be used on the web/for websites, so I see no reason not to add imgmin's lossy optimizations as a non-standard option.

pengvado commented 10 years ago

7-Zip is irrelevant here. Unlike Deflate, JPEG doesn't allow you to switch Huffman codebooks mid-stream. And as long as you only get one codebook, there's a simple algorithm that gives the exact optimal Huffman codes, and everyone already uses it.
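For reference, the "simple algorithm" for a single codebook is ordinary Huffman construction from symbol frequencies. The sketch below computes optimal code lengths with a heap; note that real JPEG encoders additionally cap lengths at 16 bits and reserve one codeword, an adjustment this illustrative version omits:

```python
import heapq

def huffman_code_lengths(freqs):
    """Given {symbol: frequency}, return {symbol: code length} for an
    optimal (unrestricted-length) Huffman code. JPEG encoders apply a
    further 16-bit length limit on top of this, not shown here."""
    # Heap entries: (weight, unique tiebreak, {symbol: depth so far}).
    heap = [(w, i, {s: 0}) for i, (s, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    if len(heap) == 1:  # degenerate case: one symbol still needs one bit
        return {next(iter(freqs)): 1}
    tiebreak = len(heap)
    while len(heap) > 1:
        w1, _, d1 = heapq.heappop(heap)
        w2, _, d2 = heapq.heappop(heap)
        # Merging two subtrees pushes every contained symbol one level deeper.
        merged = {s: d + 1 for s, d in {**d1, **d2}.items()}
        tiebreak += 1
        heapq.heappush(heap, (w1 + w2, tiebreak, merged))
    return heap[0][2]
```

Because the optimum for one fixed codebook is fully determined by the frequencies, there is nothing a fancier Deflate-style encoder could improve on here.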

133794m3r commented 10 years ago

Ah, OK. I didn't know that, as I haven't looked at JPEG in full detail.

kowpa1990 commented 10 years ago

Support lossy/aggressive optimizations

JPEGs are sometimes stored at unnecessarily high quality, and there are tools like adept and JPEGmini that try to recompress JPEGs at the lowest possible quality.

In my opinion, you should introduce lossy compression of images. I recently tried jpeg-recompress and it works perfectly. I mean implementing a solution that introduces no visible change, at first sight, in the appearance of the image.
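The general strategy behind such tools is a binary search over the JPEG quality setting against a perceptual similarity metric. A minimal sketch of that loop, with `encode` and `score` as illustrative placeholders (not any tool's actual API):

```python
def find_min_quality(encode, score, target, lo=40, hi=95):
    """Binary-search the lowest quality whose perceptual score still
    meets `target` (higher score = more similar, e.g. SSIM).
    `encode(q)` produces a compressed image at quality q and
    `score(img)` compares it against the original; both are
    caller-supplied. Returns `hi` if the target is never reached."""
    best = hi
    while lo <= hi:
        mid = (lo + hi) // 2
        if score(encode(mid)) >= target:
            best = mid       # good enough: try an even lower quality
            hi = mid - 1
        else:
            lo = mid + 1     # too lossy: raise the quality floor
    return best
```

Each probe is a full encode-plus-compare pass, which is why this approach multiplies encoding time, as discussed further down the thread.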

Other interesting projects of this type:

dwbuiten commented 10 years ago

Also possibly of interest: JPEGmini's metric (BBCQ) is detailed in their SPIE 2011 paper here. It's basically geometrically weighted PSNR, local variance, and edge detection on 8x8 block boundaries, computed over tiles / a window with a few hacks based on the max luma value. I implemented their PSNR and AAE metrics here, but not the local variance (I used a linear weighting because I was lazy, and MATLAB made deriving it easy).

They do other fancy things like mucking with deadzones and stuff too.
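Of the BBCQ ingredients mentioned above, PSNR is the simplest building block. A plain-Python sketch for 8-bit image planes (flat lists, purely illustrative; the full metric also weights in local variance and edge terms):

```python
import math

def psnr(a, b, peak=255):
    """Peak signal-to-noise ratio between two equal-length 8-bit
    pixel sequences. Identical inputs give infinity; otherwise
    10*log10(peak^2 / MSE) in decibels."""
    mse = sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)
    if mse == 0:
        return float("inf")
    return 10 * math.log10(peak * peak / mse)
```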

As for Adept's ROI coding, it seems nifty, but dangerous. e.g. on flat images like cartoons or anime it could go very, very wrong.

kornelski commented 10 years ago

@dwbuiten Unfortunately I can't check it out, as the vimeows.com GitHub link requires a login, and the linear-weighting link seems to point to the same page.

dwbuiten commented 10 years ago

@pornel I've edited my post to link to the correct URL now. I've been having issues with the latest Firefox not updating the URL bar properly, and I keep messing up copy/paste of URLs.

gunta commented 10 years ago

+1 for jpeg-recompress / JPEGmini-like algorithms. I think this is the most important feature to add, since it will yield more than 10% extra compression in the end.

Manual tuning of compression quality should be left for very special cases only.

danielgtaylor commented 10 years ago

Author of jpeg-recompress here. I plan to evaluate and switch to using libmozjpeg for encoding in the near future. The problem I see with adding this functionality to the mozjpeg encoder by default is that it requires many encoding passes, so image encoding times increase by, e.g., 5x. That may not be the best default behavior.

That said, if you plan to add it to mozjpeg, let me know how I can help.

dwbuiten commented 10 years ago

@danielgtaylor Most of that slowness is from the entropy coding optimization. Disabling that can probably speed it up quite a bit, while still using trellis.

danielgtaylor commented 10 years ago

I've added support for libmozjpeg to jpeg-recompress in https://github.com/danielgtaylor/jpeg-archive/commit/aa9abe16918fc5421d982b0ccdbb9b2c3f2eeffa and the results so far are promising. Running jpeg-recompress now takes about twice as long as it did before, with only the final step using all optimizations (far better than taking 5x as long). It may be possible to reduce this further. Initial results with my small test data set:

| Folder | Size (MB) | Compression Ratio | Time (seconds) |
| --- | --- | --- | --- |
| test-files | 16.8 | 100% | - |
| test-output-libjpegturbo | 5.8 | 35% | 16.7 |
| test-output-libmozjpeg | 4.7 | 28% | 27.4 |

That's an average 7% extra reduction in file size at the same perceived visual quality (using `-q medium`, which sets an SSIM target of 0.9999).
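For clarity, the percentages follow directly from the sizes in the table; a quick arithmetic check:

```python
# Sizes in MB, taken from the results table above.
original, libjpegturbo, libmozjpeg = 16.8, 5.8, 4.7

turbo_ratio = libjpegturbo / original      # ~0.35 -> the 35% in the table
moz_ratio = libmozjpeg / original          # ~0.28 -> the 28% in the table
extra_reduction = turbo_ratio - moz_ratio  # ~0.065 -> the "7% extra" quoted
```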

gunta commented 10 years ago

@danielgtaylor Sounds nice!