ibezkrovnyi / image-quantization

Image Quantization Library with alpha support (based on https://github.com/leeoniya/RgbQuant.js, https://github.com/timoxley/neuquant and http://www.ece.mcmaster.ca/~xwu/cq.c)

Bug in the distance calculators #4

Open Nommyde opened 8 years ago

Nommyde commented 8 years ago

In the ManhattanSRGB and EuclideanRgb... methods, r, g, and b are multiplied by the coefficients RED = .2126, GREEN = .7152, BLUE = .0722.

This makes no sense, because those coefficients apply to linear CIE RGB, not to sRGB (which is what all images on the web use). It would make sense if the coefficients were converted into the sRGB domain; they then become: RED = 0.4984, GREEN = 0.8625, BLUE = 0.2979.
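The proposed values can be reproduced by pushing the Rec.709 weights through the standard sRGB encoding curve. A minimal sketch (the helper name is illustrative, not part of the library):

```javascript
// Sketch: derive the sRGB-domain coefficients by applying the standard
// sRGB encoding function to the linear Rec.709 weights.
function linearToSrgb(c) {
  return c <= 0.0031308 ? 12.92 * c : 1.055 * Math.pow(c, 1 / 2.4) - 0.055;
}

const RED   = linearToSrgb(0.2126); // ≈ 0.4984
const GREEN = linearToSrgb(0.7152); // ≈ 0.8625
const BLUE  = linearToSrgb(0.0722); // ≈ 0.2979
```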

Alternatively, instead of multiplying the components by coefficients, you could convert each pixel from sRGB to linear CIE RGB and measure the Euclidean distance there (but then the result loses its meaning in terms of human perception).

These coefficients are also used in the getLuminocity method, which is wrong too. There it would be better to take the Luminance (perceived option 1) formula: (0.299*R + 0.587*G + 0.114*B) [2].

Even more accurate would be to take the Y coordinate from CIE XYZ directly, but that is slow to compute.
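The two alternatives above can be sketched as follows, assuming the standard sRGB decoding formula (the helper names are illustrative, not the library's API):

```javascript
// Standard sRGB decoding: gamma-encoded 8-bit channel -> linear light [0..1].
function srgbToLinear(c8) {
  const c = c8 / 255;
  return c <= 0.04045 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
}

// Alternative 1: plain Euclidean distance in linear light
// (perceptual meaning is lost, as noted above).
function linearEuclidean(a, b) {
  const dr = srgbToLinear(a.r) - srgbToLinear(b.r);
  const dg = srgbToLinear(a.g) - srgbToLinear(b.g);
  const db = srgbToLinear(a.b) - srgbToLinear(b.b);
  return Math.sqrt(dr * dr + dg * dg + db * db);
}

// Alternative 2 (the slow-but-accurate option): the Y coordinate of
// CIE XYZ, i.e. the Rec.709 weights applied to *linear* components.
function relativeLuminanceY(r, g, b) {
  return 0.2126 * srgbToLinear(r) +
         0.7152 * srgbToLinear(g) +
         0.0722 * srgbToLinear(b);
}
```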

leeoniya commented 8 years ago

Igor probably got those numbers from the original RgbQuant source [1]. I got those coefficients from Rec 709 [2] for RGB.

@Nommyde, where are you getting your coefficients? I could not find them anywhere. Do they look more accurate in your testing? Also, if sRGB is non-linear, then how would different (but still static) coefficients be any more accurate?

[1] https://github.com/leeoniya/RgbQuant.js/blob/master/src/rgbquant.js#L703
[2] https://en.wikipedia.org/wiki/Rec._709

Nommyde commented 8 years ago

@leeoniya I got these coefficients by applying the sRGB conversion formula (see Csrgb).

> Do they look more accurate in your testing?

Yes. And if you scale the RGB levels with these coefficients and compare equal levels (against a black rectangle), the sRGB colors will have equivalent lightness. But if you scale with the Rec.709 coefficients, the lightness will differ.

> if sRGB is non-linear, then how would different (but still static) coefficients be any more accurate?

It is non-linear in energy, but approximately linear for human perception, so it can be scaled linearly.

Nommyde commented 8 years ago

@leeoniya Rec.709 coeffs applied: srgb2

Red and blue come out lighter than green at the same levels.

If a coefficient is too small (like the linear 0.0722), a small difference in the corresponding coordinate will make a big color change, while the same difference in another coordinate will make only a small color change.
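To make the asymmetry concrete, here is a small sketch of a weighted Manhattan distance under both sets of coefficients (the helper is illustrative, not the library's implementation):

```javascript
// Illustrative weighted Manhattan distance: the same per-channel delta
// contributes very differently depending on the channel weight.
function weightedManhattan(a, b, w) {
  return w.r * Math.abs(a.r - b.r) +
         w.g * Math.abs(a.g - b.g) +
         w.b * Math.abs(a.b - b.b);
}

const linearWeights = { r: 0.2126, g: 0.7152, b: 0.0722 }; // Rec.709, linear-light
const srgbWeights   = { r: 0.4984, g: 0.8625, b: 0.2979 }; // proposed, sRGB domain

const black  = { r: 0, g: 0, b: 0 };
const blue50 = { r: 0, g: 0, b: 50 };

// Under the linear weights a 50-level blue difference barely registers:
weightedManhattan(black, blue50, linearWeights); // ≈ 3.61
weightedManhattan(black, blue50, srgbWeights);   // ≈ 14.895
```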

leeoniya commented 8 years ago

@Nommyde

Interesting, thanks!

I'll play around with the adjusted coefficients in RgbQuant as well and see how things look. Though RgbQuant uses a component-scaled Euclidean distance [1], which got better results than a simple scaled sum.

[1] http://alienryderflex.com/hsp.html

Nommyde commented 8 years ago

@leeoniya in your project luma and distance are calculated from the squares of the components (which looks like a fast approximation of gamma correction), so the Rec.709 coefficients are more suitable there. But this project has no gamma correction.

Nommyde commented 8 years ago

@igor-bezkrovny by the way, instead of changing the coefficients you could also try applying fast gamma correction by squaring the components, as @leeoniya does.
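A sketch of that trick, in the spirit of the HSP-style approximation RgbQuant uses: squaring a gamma-encoded value is a cheap stand-in for decoding it to linear light (since sRGB gamma is roughly 2.2). The function name is mine, not the library's:

```javascript
// Fast gamma-approximated luma: square the gamma-encoded components
// (cheap stand-in for linear light), weight, then map back with sqrt.
function fastLuma(r, g, b) {
  return Math.sqrt(0.2126 * r * r + 0.7152 * g * g + 0.0722 * b * b);
}

fastLuma(255, 255, 255); // 255
fastLuma(0, 0, 255);     // ≈ 68.52
```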

leeoniya commented 8 years ago

oh, i see.

The Manhattan option [1] doesn't use squares though, so maybe it's worth using the adjusted coefficients for that.

[1] https://github.com/leeoniya/RgbQuant.js/blob/master/src/rgbquant.js#L733

Nommyde commented 8 years ago

Maybe, try them :)

leeoniya commented 8 years ago

The quality that image-quantization gets with Wu v2 w/alpha + Riemersma is just astounding. IMO nothing beats it in any of the test cases I have tried. And it gets these results with no tweaking needed. The perf could use a bit of work...perhaps with asm.js or WebAssembly or even better by offloading to WebGL shaders & Web Workers.

btw, until I poke my server, working demos are here: http://leeoniya.github.io/RgbQuant.js/demo/

Nommyde commented 8 years ago

> nothing beats it

@leeoniya, among the methods in this project, or among all the ones you know of?

leeoniya commented 8 years ago

Among everything I've seen before giving up :) Including various combinations of [1] [2].

Maybe there is something better, but I would have a hard time determining if it was actually better or just "different".

[1] http://www.codeproject.com/Articles/66341/A-Simple-Yet-Quite-Powerful-Palette-Quantizer-in-C
[2] http://bisqwit.iki.fi/story/howto/dither/jy/

Nommyde commented 8 years ago

http://igor-bezkrovny.github.io/image-q/demo/0.1.4/index.html can I pick that best combination here? Or is this demo outdated?

leeoniya commented 8 years ago

quant_combo

Nommyde commented 8 years ago

I think that at low color counts scolorq works best of all. Original:

image

Wu: 4 colors

image

scolorq: 4 colors

image

http://bisqwit.iki.fi/jutut/colorquant/index4.html
http://www.cs.berkeley.edu/~dcoetzee/downloads/scolorq/
http://www.ximagic.com/q_results_kodim04_16.html

leeoniya commented 8 years ago

In this case, yes.

When the color counts are very low, the variation is huge not only between libraries but also between individual samples. At this point you're just choosing what you personally prefer for your specific workload, since every answer is very wrong, just along different axes, lol.

I think for >= 16 colors Wu + Riemersma is very hard to beat over a diverse workload. Of course I could be wrong :)

Nommyde commented 8 years ago

Even at 16 colors scolorq seems more accurate to me :) But I haven't tested its performance; I suspect it is very slow.

wu - original - scolorq image

leeoniya commented 8 years ago

I won't argue :)

leeoniya commented 8 years ago

However, the demo also doesn't have a Wu version with component coefficients, which often have a large impact and might fix that shift toward red in the Wu version.

ibezkrovnyi commented 8 years ago

Regarding original post:

There is an opinion that sRGB luminosity should be calculated as follows:

  1. in Russian
  2. more verbose, in English
| Name | Red | Green | Blue |
| --- | --- | --- | --- |
| NTSC RGB | 0.298839 | 0.586811 | 0.114350 |
| CIE RGB | 0.176204 | 0.812985 | 0.010811 |
| sRGB | 0.212656 | 0.715158 | 0.072186 |

The Luminance (perceived option 1) coefficients (0.299*R + 0.587*G + 0.114*B) [2] come from the 1953 standard for NTSC/phosphor displays - see this

So it seems that the best coefficients for the Y of sRGB images are RED = .2126, GREEN = .7152, BLUE = .0722.

@Nommyde @leeoniya

Nommyde commented 8 years ago

@igor-bezkrovny anyway, calculating luma as a linear combination of non-linear sRGB components (ignoring gamma) is a bad idea if you need high precision; it is just an approximation. And I was wrong when I said that the bt.601 coefficients are better, but the bt.709 coefficients are not the best either. For example, if you compare Y calculated with the bt.709 coefficients and Y calculated with the bt.601 coefficients against the more accurate L* component, you will see approximately the same errors. But bt.601 works better for dark colors, and bt.709 for light colors. Here is the code: https://jsfiddle.net/Nommyde/3Lxe10pp/ So there are no ideal coefficients for calculating Y this way.

But in the Manhattan algorithm I think my coefficients will be better :) Try them on real pictures.

ibezkrovnyi commented 8 years ago

> For example, if you compare Y calculated with the bt.709 coefficients and Y calculated with the bt.601 coefficients against the more accurate L* component, you will see approximately the same errors. But bt.601 works better for dark colors, and bt.709 for light colors. Here is the code: https://jsfiddle.net/Nommyde/3Lxe10pp/ So there are no ideal coefficients for calculating Y this way.

Yes, but bt.709 is the best standard now, and the only one defined for sRGB.

> But in the Manhattan algorithm I think my coefficients will be better :) try them on real pictures

I temporarily added ManhattanNommyde color distance method here: http://igor-bezkrovny.github.io/image-q/demo/1.0.1/index.html. It uses coefficients 0.4984 * R + 0.8625 * G + 0.2979 * B. You can play with different images comparing it with ManhattanSRGB and other color distance methods.
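For reference, a minimal sketch of what such a ManhattanNommyde distance presumably computes (the actual image-q implementation may differ):

```javascript
// Sketch of a Manhattan distance using the proposed sRGB-domain
// coefficients 0.4984 / 0.8625 / 0.2979.
function manhattanNommyde(a, b) {
  return 0.4984 * Math.abs(a.r - b.r) +
         0.8625 * Math.abs(a.g - b.g) +
         0.2979 * Math.abs(a.b - b.b);
}

manhattanNommyde({ r: 255, g: 0, b: 0 }, { r: 0, g: 0, b: 0 }); // ≈ 127.092
```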

However:

Nommyde commented 8 years ago

@igor-bezkrovny in the Manhattan algorithm the sum of the coefficients does not have to be 1.0, so they are not directly applicable in that script (https://jsfiddle.net/Nommyde/3Lxe10pp/), because in the simple Manhattan distance the sum of the coefficients is 3.0 :) and that is not a problem. But if you want, you can scale them (divide by 1.6588); the result will not change.
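The scale-invariance claim is easy to check: uniformly scaling all three weights (e.g. dividing by their sum, 1.6588) rescales every distance by the same factor, so the nearest palette entry never changes. A sketch with illustrative helper names:

```javascript
// Weighted Manhattan distance over [r, g, b] triples.
function weightedManhattan(a, b, w) {
  return w[0] * Math.abs(a[0] - b[0]) +
         w[1] * Math.abs(a[1] - b[1]) +
         w[2] * Math.abs(a[2] - b[2]);
}

// Index of the palette entry closest to `pixel` under weights `w`.
function nearestIndex(pixel, palette, w) {
  let best = 0;
  for (let i = 1; i < palette.length; i++) {
    if (weightedManhattan(pixel, palette[i], w) <
        weightedManhattan(pixel, palette[best], w)) best = i;
  }
  return best;
}

const raw = [0.4984, 0.8625, 0.2979];          // sums to 1.6588
const scaled = raw.map(c => c / 1.6588);       // sums to ~1.0

const palette = [[0, 0, 0], [255, 0, 0], [0, 255, 0], [0, 0, 255], [255, 255, 255]];
const pixel = [200, 40, 90];

// Both weightings pick the same palette entry:
nearestIndex(pixel, palette, raw) === nearestIndex(pixel, palette, scaled); // true
```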

simple manhattan:

image

nommyde manhattan:

image

sRGB manhattan (looks wrong):

image