whatwg / html

HTML Standard
https://html.spec.whatwg.org/multipage/

Standard for chroma subsampling based on encode quality #5395

Open LeonScroggins opened 4 years ago

LeonScroggins commented 4 years ago

See web compat issue 32547 and crbug.com/972180 for more details and history.

When encoding to JPEG, Firefox treats a quality of 90 or above as a request to disable subsampling. Chromium, meanwhile, only disables subsampling at quality 100 (CanvasAsyncBlobCreator and ImageDataBuffer).
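For concreteness, the call in question is just the standard quality argument to canvas.toBlob(); a value near the thresholds above is where the behaviors diverge. A minimal snippet, assuming `canvas` is an existing HTMLCanvasElement that has already been drawn to:

```js
// Encode at quality 0.9: today Firefox emits a non-subsampled (4:4:4) JPEG here,
// while Chromium keeps chroma subsampling enabled for any quality below 1.0.
canvas.toBlob(blob => console.log('JPEG size:', blob.size), 'image/jpeg', 0.9);
```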

By turning off chroma subsampling at a lower quality, Firefox is able to create much smaller JPEGs that still have relatively good-looking text (i.e. not blurry).

For compatibility, the two browsers should have the same behavior, but it's not obvious (at least to me) what the right behavior is. According to the Chromium bug, Photoshop uses a quality of 50 or higher to turn off chroma subsampling. What do other browsers do? It would be ideal if there were a spec somewhere for how to behave.

A tangential idea is to add a new API that allows specifying whether to use chroma subsampling.

domenic commented 4 years ago

It's not clear whether we really want to make the quality argument imply specific encoding settings; to some extent it's a fuzzy dial by design. /cc @whatwg/canvas for their thoughts.

We could potentially extend the argument list for toBlob or toDataURL with various encoding settings, if implementers were interested and could agree on such a list.

micahjon commented 4 years ago

Standardization (or more consistent defaults) around chroma subsampling would make it easier to develop cross-platform webapps that export images.

- Firefox turns off subsampling at 90% quality and above
- Chrome & Safari turn off subsampling ONLY at 100% quality

At befunky.com, users create text & icon-rich images in WebGL and export them for print or web use. We find Firefox's defaults to be the most convenient, as users can export crisp images (no subsampling) at reasonable quality settings (e.g. 95%). In contrast, in Chrome or Safari, to get a crisp image at a size suitable for web use, users must export at 100% quality and then compress their images with a different program.

To get around this, other online photo editors use WebAssembly to load their own image compression library, which comes at significant costs in performance and complexity. It would be wonderful if browser implementations were optimized for common use cases (render text on canvas, export canvas as blob), and we could ship lighter apps to our users. Really appreciate you all looking into this!

TimothyGu commented 4 years ago

I personally think this is primarily an implementation concern. Firefox and Chrome both use libjpeg (or some variant of it) for JPEG compression, where you can tweak the quality measure and the chroma subsampling method independently. Safari on macOS, on the other hand, uses Apple's Core Graphics API, which does not allow this.

That said, I agree it may make sense for specific implementations to either align with other implementations or simply make the quality scale more linear.

For fun, I plotted the size of the output of encoding this photo to JPEG using different settings on Chrome and Firefox:

[Plot of output JPEG size vs. encode quality; blue is Chrome, orange is Firefox]

As expected, there is some discontinuity for Firefox at 0.9, while Chrome gets a large bump at 100%. I feel it makes sense for Chrome to use 4:4:4 for some non-100% qualities, to close the huge gap between 99% and 100%. But I still believe this doesn't belong in the spec itself.

Script to reproduce:

```js
img = new Image();
img.crossOrigin = 'anonymous'; // needed when run from another origin so the canvas is not tainted and toBlob works
img.src = 'https://upload.wikimedia.org/wikipedia/commons/6/6d/Reichstagsgeb%C3%A4ude_und_Paul-L%C3%B6be-Haus%2C_Berlin-Mitte%2C_170327%2C_ako.jpg';
img.onload = () => {
  canvas = document.createElement('canvas');
  document.body.appendChild(canvas);
  context = canvas.getContext('2d');
  canvas.width = 9427;
  canvas.height = 5303;
  context.drawImage(img, 0, 0, 9427, 5303);
  // Encode at qualities 0.85 through 1.00 and log the resulting blob sizes.
  for (let i = 85; i <= 100; i++)
    canvas.toBlob(b => console.log(i / 100, b.size), 'image/jpeg', i / 100);
};
```
micahjon commented 4 years ago

Thanks @TimothyGu. I agree: even using 4:4:4 for 95-99% qualities would bring Chrome much closer to a linear quality scale and to Firefox's implementation.

As for making chroma subsampling configurable, one idea would be to add an "options" parameter to canvas.toBlob and canvas.toDataURL that would specify this and future image-encoding properties in supporting browsers, similar to what was done for addEventListener with options.capture, options.passive, and options.once. https://developer.mozilla.org/en-US/docs/Web/API/EventTarget/addEventListener

Unlike addEventListener, where options replaces an existing parameter and must be polyfilled, we would have the luxury of adding an additional one that would simply be ignored by old browsers.
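To make that concrete, here's a hypothetical sketch; the extra fourth argument and the chromaSubsampling name are made up purely for illustration and are not a real or proposed API:

```js
// Hypothetical only: an options bag as an extra argument to toBlob.
// As noted above, browsers that don't know about it would simply ignore the extra argument.
canvas.toBlob(
  blob => {
    // use the encoded blob (e.g. upload or offer it for download)
  },
  'image/jpeg',
  0.95,
  { chromaSubsampling: false } // invented option name, for illustration
);
```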

However, even if there were consensus on an API, it would still make sense for Chromium to choose a better default value. @LeonScroggins, any thoughts?

tomdav999 commented 3 years ago

Hi, just an end user here, but I thought I'd add my two cents. A few months ago I thought I'd do something cool and write some code to resize file uploads client-side before uploading to the server. I found a wonderful JavaScript library to resize via canvas using Lanczos 3 and USM. Everything seemed so promising...
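In rough terms, the flow was something like the sketch below, minus the Lanczos 3 / USM filtering the library handles (drawImage's built-in scaling stands in for it here); the /upload endpoint is just a placeholder:

```js
// Simplified client-side resize-and-upload flow (illustrative only).
async function resizeAndUpload(file, maxWidth) {
  const bitmap = await createImageBitmap(file);            // decode the user's file
  const scale = Math.min(1, maxWidth / bitmap.width);
  const canvas = document.createElement('canvas');
  canvas.width = Math.round(bitmap.width * scale);
  canvas.height = Math.round(bitmap.height * scale);
  canvas.getContext('2d').drawImage(bitmap, 0, 0, canvas.width, canvas.height);
  const blob = await new Promise(resolve =>
    canvas.toBlob(resolve, 'image/jpeg', 0.9)              // 0.9: subsampling behavior differs per browser
  );
  return fetch('/upload', { method: 'POST', body: blob }); // placeholder endpoint
}
```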

Two months later, other than learning a lot in the process, this mostly ended up being a waste of time thanks to the various issues noted here. First, the lack of consistency among browsers and the mystery of what is going on under the hood. Second, the chroma subsampling in Chrome and Safari at quality < 100%, which negates a significant amount of the savings from resizing for most clients, based on market share. Third, wasm modules (e.g. mozjpeg) not really being a solution, as they run too slowly or consume too much processing on some clients, effectively negating the time savings from uploading smaller files. I eventually cried uncle.

I'm not sure what the answer is, but at the very minimum browsers shouldn't be doing chroma subsampling at higher quality settings. The 100% threshold that Chrome and Safari use is insane. Firefox seems to have gotten it right with the 90% threshold, and furthermore I think the mozjpeg folks were quite sensible in how they blend things in: 2x1 subsampling by default below 90% quality, 2x2 by default below 80%, and chroma subsampling and other options exposed for the user to override. The latter would be ideal, but that's arguably not the realm of the standards groups.