google / guetzli

Perceptual JPEG encoder
Apache License 2.0
12.9k stars, 977 forks

Google Pagespeed and guetzli #206

Open DrBGM opened 7 years ago

DrBGM commented 7 years ago

Is there a way to inform Google PageSpeed that all images have already been compressed with guetzli -q 95?

I still get the message that the images can all be compressed even more (up to 50%). This information is questionable, of course.

Would it be better to use the image versions that Google PageSpeed offers for download after the site check (optimizeContents)?

Any ideas welcome.

Best regards, Dr.BGM

DeeDeeG commented 7 years ago

Ultimately, PageSpeed Insights are just suggestions.

There are official Google pages that fail certain tests on PageSpeed. (example).

The desire for visual quality and the desire for small file sizes tend to oppose each other, and machines can't say which trade-off is best. There isn't really a single best option.

That said, you can download the images PageSpeed offers and compare them with the ones you got from Guetzli. Nothing wrong with that. If you like those better, maybe they are better.

Generally, Guetzli excels at keeping visual quality high. Other encoders are freer and more aggressive in how they encode, so it's easy to get a smaller file from them. If you can't see the artifacts and the color quality looks the same, it's possible another encoder is the better choice for that particular image.

(PageSpeed won't look at image quality, only size! So it takes a human eye to answer this question.)

(Maybe they will fix PageSpeed to figure out if images have already been compressed, somehow? But I don't even know if that's possible.)

DeeDeeG commented 7 years ago

So to be a little less philosophical and more practical...

I would assume PageSpeed runs your images through some kind of lossy encoder, and if your original image is some percentage larger than PageSpeed's re-encoded version, it recommends you compress.

I noticed the compressed images they gave me for one of my sites had noticeable artifacts.

This leads me to believe their recommendation is based purely on file size, not visual quality.

So following their compression advice is following the advice of an oversimplifying robot, and the human eye can do better. I think your best bet is to generate encodes at a few qualities, for example --quality 95, --quality 90, and --quality 85 with Guetzli, and pick whichever is smallest with acceptable visual quality.

If you want to dig deeper, run some encodes between the two best quality settings (e.g. between 85 and 90, try 87 and 89) and see if those are better.

If you want to go even deeper, you might try with another encoder and compare to see if your results are any better. (As far as I know, the most popular encoders besides Guetzli are libjpeg, and two forks of libjpeg: libjpeg-turbo, and mozjpeg. These can be used from the command line, or they are built into other programs such as ImageMagick and GIMP. If you use Photoshop, that might have its own proprietary encoder, etc. so just pick one of these and give it a try.)
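The quality sweep suggested above could be scripted like this. This is just a sketch: it only prints the Guetzli commands (a dry run) so you can inspect them; drop the `echo` to actually encode. "input.png" and the output names are placeholders, and it assumes guetzli is on your PATH. Note Guetzli's minimum supported quality is 84, so don't sweep below that.

```shell
# Dry run: print one Guetzli encode command per quality level.
for q in 95 90 85; do
  echo "guetzli --quality $q input.png output-q$q.jpg"
done
```

You can then compare the resulting file sizes and inspect each output by eye, keeping the smallest one whose artifacts you can't see.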

thehelp1 commented 7 years ago

It's a ways away.


FossPrime commented 7 years ago

I had an image compressed with Guetzli at quality 84 (its minimum) from 215.8 kB to 98.6 kB. The one PageSpeed recommends is 86.7 kB, 11.9 kB smaller than the smallest Guetzli will produce.

Someone at Google needs to stop and tell us what they want from us. Ilya Grigorik once showed off a script that checked visual quality and compressed to that point... if he could do it, PageSpeed can. Meanwhile, having Guetzli go down to whatever PageSpeed / Google Search wants will have to do.

FossPrime commented 7 years ago

Did more research. The problem is squarely with PageSpeed.

Update: I've ended up just using the PageSpeed-recommended ImageMagick command. I had to update ImageMagick from 6.7, which was producing awful results... that PageSpeed was okay with. The easiest thing to do is compile from source; helostore's guide using remi's repo installs 6.7 even today. Here's what I ran on CentOS 7:

sudo yum install autoconf libpng-devel libjpeg-devel
git clone https://github.com/ImageMagick/ImageMagick.git
cd ImageMagick
VERSION=$(git tag | sort -V | tail -1)
git checkout $VERSION
./configure
make
sudo make install
sudo ldconfig /usr/local/lib
/usr/local/bin/convert -version
Version: ImageMagick 7.0.5-10 Q16 x86_64 2017-06-05 http://www.imagemagick.org

https://www.imagemagick.org/script/advanced-unix-installation.php

Update 3: PageSpeed really wants you to use RGB, not sRGB. If Guetzli supported it, we could do something like this and probably appease PageSpeed, and pass chroma compression parameters as well: https://gist.github.com/Mika-/185c8d777aab3ebba548

tomByrer commented 7 years ago

The rule seems to be: if libjpeg can reduce it by more than 10%, PageSpeed will complain, even if a much smaller mozjpeg / guetzli file looks better.

So, run libjpeg in lossless mode after Guetzli, just like ImageOptim / FileOptimizer does ;) Guetzli only handles the lossy encoding of the pixel data; other JPEG optimizers can squeeze out another 5-15% by reordering the data & losslessly recompressing the reordered data.
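The 10% rule described above can be sketched as a quick size check. The threshold is the commenters' observation of PageSpeed's behavior, not documented behavior, and `pagespeed_check` is a hypothetical helper name; sizes are in bytes.

```shell
# Complain when a re-encode is more than 10% smaller than the original,
# mirroring the observed PageSpeed heuristic.
pagespeed_check() {
  original=$1; reencoded=$2
  # "reencoded < 90% of original" done in integer arithmetic
  if [ $((reencoded * 100)) -lt $((original * 90)) ]; then
    echo "complain: could save $((original - reencoded)) bytes"
  else
    echo "ok"
  fi
}

pagespeed_check 100000 85000  # 15% smaller -> complain: could save 15000 bytes
pagespeed_check 100000 95000  # only 5% smaller -> ok
```

This is why a quality-88 Guetzli encode can pass while a quality-95 one fails: the test only cares whether libjpeg's re-encode beats your file by more than the threshold.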

FossPrime commented 7 years ago

They use a very specific ImageMagick line, including 4:2:0 chroma subsampling, stripped metadata, quality 85, and JPEG interlacing, which runs libjpeg with lossy settings. It might still be worth it to get the real file size down and appease PageSpeed.

So one would run Guetzli at 100 and then run PageSpeed's recommended quality-85 line... You can usually get away with up to 88 and stay under the 10% threshold.
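An ImageMagick `convert` line matching the settings described above would look roughly like this. The exact flags Google recommends may differ; the file names are placeholders, and the command is stored and printed here (a dry run) rather than executed:

```shell
# 4:2:0 chroma subsampling, stripped metadata, quality 85,
# progressive (interlaced) JPEG -- the lossy settings described above.
cmd="convert input.jpg -sampling-factor 4:2:0 -strip -quality 85 -interlace JPEG output.jpg"
echo "$cmd"
```

Run the printed command directly (with real file names) once you've confirmed the flags match what PageSpeed recommends for your site.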


kornelski commented 7 years ago

Lossless compression (jpegrescan, MozJPEG's jpegtran (included in ImageOptim)) after Guetzli is a great idea. It's a further reduction in file size for free with no extra loss of quality.

However, lossy recompression (ImageMagick, MozJPEG's cjpeg) is likely to defeat the purpose of Guetzli. Guetzli carefully selects quantization tables and quantization coefficients. Recompression is going to throw it all away, naively degrade quality, and amplify JPEG distortions due to unavoidable rounding errors in recompression. In that case you lose quality twice - the process keeps Guetzli's quality loss, but throws away its file size reduction.

So when Guetzli can't give you a small enough file size, don't use it at all. Use the second tool on your original input file instead, so you don't lose quality twice.
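The recommended workflow above (Guetzli for the lossy step, then a lossless jpegtran pass that optimizes Huffman tables and reorders scans without touching decoded pixels) could be sketched as follows. This is a dry run that only prints the commands; `guetzli_then_jpegtran` is a hypothetical helper, the file names are placeholders, and it assumes both tools are installed if you remove the `echo`s.

```shell
# Print the two-step pipeline: lossy Guetzli encode, then lossless
# recompression with jpegtran (no change to the decoded image).
guetzli_then_jpegtran() {
  in=$1; out=$2; tmp="${out%.jpg}.tmp.jpg"
  # Step 1: lossy encode with Guetzli.
  echo "guetzli --quality 95 $in $tmp"
  # Step 2: lossless pass -- optimized Huffman tables, progressive
  # scan order, metadata stripped.
  echo "jpegtran -optimize -progressive -copy none -outfile $out $tmp"
}

guetzli_then_jpegtran input.png output.jpg
```

The key point from the comment above: the second step must be lossless. Substituting a lossy re-encode (cjpeg, ImageMagick at quality 85) would discard Guetzli's carefully chosen quantization while keeping its quality loss.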

DeeDeeG commented 7 years ago

I just want to say this: it doesn't seem like they would use this "compressed images" test from Insights to do anything with your "Page Rank" in Google search. It's a really, really oversimplified test, and it barely works, since various (roughly best-practice) JPEG compression workflows fail it.

(What is it good for? It DOES point out if you have some massive uncompressed JPEG that could save 60% or more on file size. It's useful as a hint to site owners that they should compress if they haven't, but not much more useful than that. So again, I doubt they would rank your pages higher or lower based on this exact metric.)

It would be good to get some input from a Google employee on whether this is part of page ranking or not.