lovell / icc

JavaScript module to parse International Color Consortium (ICC) profiles
Apache License 2.0

Add transform table parsing and function with tests #7

Open MathieuLoutre opened 1 year ago

MathieuLoutre commented 1 year ago

Partial PR as the A2B tests don't pass.

This is picking up the work from #3 and parses mft1 and mft2 tags (B2A, A2B and gamt). It also creates transform functions for each of the tables, taking an input array (expecting normalised values between 0 and 1) and returning another array (with values normalised between 0 and 1).

E.g. profile.B2A0.transform([0, 0, 0]) returns [1, 0, 0, 0] for the Probev1 profile.
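
For reference, a rough usage sketch (the file name is just an example; icc.parse is the module's existing API, while the transform method is what this PR adds):

```js
const fs = require('fs');
const icc = require('icc');

// Parse a profile that contains B2A/A2B tables (file name is an example)
const profile = icc.parse(fs.readFileSync('Probev1.icc'));

// Inputs and outputs are arrays of values normalised to 0..1
const device = profile.B2A0.transform([0, 0, 0]);
console.log(device); // [1, 0, 0, 0] for the Probev1 profile
```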

For testing, the tables are excluded from the equality test as discussed in #2, and the transforms are used to check the output based on the probe profile.

Right now, the A2B0 transform is the only one that doesn't behave according to spec and makes the test fail. It returns 0.61 instead of a value between 0.7 and 1 as expected.
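
The failing check is roughly of this shape (a sketch only, with profile parsed as in the snippet above; the input is illustrative, the 0.7..1 band is the one mentioned above):

```js
const assert = require('assert');

// Sketch only: the real test drives several probe inputs and expected bands.
const input = [0, 0, 0, 0]; // hypothetical example input
const [first] = profile.A2B0.transform(input);

// The A2B0 case currently fails a band check like this, returning ~0.61.
assert.ok(first >= 0.7 && first <= 1, `expected 0.7..1, got ${first}`);
```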

lovell commented 1 year ago

Thanks for the PR. I see what you mean about the perceptual A2B tests producing an unexpected value (0.61...). Having taken a very quick look at the code, my best guess would be that the intermediate clutValueForOutput array is somehow incorrect, but I don't have any time right now to debug this further.

MathieuLoutre commented 10 months ago

Hi @lovell, I'm picking this up again. Did some more investigation and used the official ICC Profile Inspector to check the transforms.

An A2B0 transform of [1, 1, 1, 1] gives:

{ inputColor: [ 65535, 65535, 65535, 65535 ] }
{ inputColorForClut: [ 4, 4, 4, 4 ] }
{ clutValue: [ 13102, 32669, 33236 ] }

And the official tool finds the same values.

An A2B0 transform of [0, 0, 0, 0] gives:

{ inputColor: [ 0, 0, 0, 0 ] }
{ inputColorForClut: [ 0, 0, 0, 0 ] }
{ clutValue: [ 65279, 32768, 32767 ] }

This is also confirmed by the tool.
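
For context, the input-to-index step boils down to roughly this (grid size hard-coded to 5 here purely because that's what the max index of 4 in the logs suggests):

```js
// Map 16-bit input values to CLUT grid indices; gridPoints = 5 is assumed
// here only to match the max index of 4 seen in the logs above.
const gridPoints = 5;

function clutIndices (inputColor) {
  return inputColor.map(value => Math.floor((value / 65535) * (gridPoints - 1)));
}

console.log(clutIndices([65535, 65535, 65535, 65535])); // [ 4, 4, 4, 4 ]
console.log(clutIndices([0, 0, 0, 0]));                 // [ 0, 0, 0, 0 ]
```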

So it seems you're right: it's something after that step that isn't correct. The clutValueForOutput for both of those is [ 0, 0, 0 ]. Since the first output table only has 2 values ([ 40000, 65535 ]), and 40000/65535 is ~0.61, passing the test (>= 0.7 && <= 1) would require the first value of clutValueForOutput to be 1.

I'm not sure how that's expected, given that 13102 seems to be the right value according to the tool, and 13102/65535 is only ~0.19 while 65279/65535 is ~0.99. Worth noting: based on the docs mentioning the need to round values, I tried rounding instead of flooring, and it fixed only one of the two cases (the one yielding 0.99) while breaking another on A2B2.
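
To make the floor/round difference concrete, here's the output-table lookup as I understand it, using the 2-entry table from the probe profile:

```js
// The 2-entry output table for the first channel of the probe profile.
const outputTable = [40000, 65535];

function lookupOutput (clutValue, indexFn) {
  const index = indexFn((clutValue / 65535) * (outputTable.length - 1));
  return outputTable[index] / 65535;
}

console.log(lookupOutput(13102, Math.floor)); // ~0.61
console.log(lookupOutput(13102, Math.round)); // ~0.61 (0.19 still rounds down to index 0)
console.log(lookupOutput(65279, Math.floor)); // ~0.61
console.log(lookupOutput(65279, Math.round)); // 1    (the ~0.99 case that rounding fixes)
```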

Another option would be that the output array itself is wrong and the first value should be at least 45900 rather than 40000. However, reading it with readUInt16BE seems correct based on the docs, so I'm not sure...
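
For completeness, the raw read is just consecutive big-endian 16-bit values; the names tagData and outputTableOffset below are placeholders, not the actual mft2 layout:

```js
// Read `entries` consecutive uint16 (big-endian) values starting at `offset`;
// in the parser, the offset would point at the output table within the tag data.
function readUInt16Table (buffer, offset, entries) {
  const table = [];
  for (let i = 0; i < entries; i += 1) {
    table.push(buffer.readUInt16BE(offset + i * 2));
  }
  return table;
}

// e.g. readUInt16Table(tagData, outputTableOffset, 2) // -> [ 40000, 65535 ]
```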

Do you have any insight? Is there something really obvious I've overlooked?