facelessuser / coloraide

A library to aid in using colors
https://facelessuser.github.io/coloraide
MIT License

CCT related support #321

Closed facelessuser closed 1 year ago

facelessuser commented 1 year ago

Not quite sure how this will look, but I imagine we'd like at the very least a method to create a color from a given temperature in Kelvin and a way to retrieve an approximate temperature for a given color.

There are many different ways to convert to and from CCT. Some are good for certain ranges, some are more accurate than others, and some don't convert as easily in the reverse direction.

facelessuser commented 1 year ago

By generating the Planckian locus from the standard observer, we can produce colors on the black body curve. Colors are currently returned as XYZ values by taking the calculated x, y chromaticities and scaling them to a Y of 1. This means that the colors are all very bright.

In order to get the colors that other implementations return, you need to take the max of the RGB values and normalize by dividing all the channels by that max. Afterward, we can clip any values that still fall outside the gamut. I'm not sure if this should be included in the function or left to the user...
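
As a rough sketch of those two steps (helper names here are purely illustrative, not a proposed API): the xy pair is lifted to XYZ with Y = 1, and once converted to an RGB space, the channels are scaled by the largest one and clipped.

def xy_to_xyz(x, y, Y=1.0):
    """Convert an xy chromaticity pair to XYZ with the given luminance."""
    return [x * Y / y, Y, (1 - x - y) * Y / y]

def normalize_rgb(rgb):
    """Scale RGB channels by the largest one, then clip what remains out of gamut."""
    m = max(rgb)
    if m:
        rgb = [c / m for c in rgb]
    return [min(max(c, 0.0), 1.0) for c in rgb]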

(Screenshot: generated black body curve colors.)
facelessuser commented 1 year ago

I was able to get Robertson 1968 working locally. The idea was to use it to get the temperature of a given color. The one problem with it is that it starts to break down below ~1666.7K. Ideally, we'd like to target a range starting at 1000K. If a method has to be capped at some high end, that's fine; we don't have to go as high as 100000K, but it would be nice if we could, even if accuracy drops some.

The problem is that each method has its own strengths and weaknesses. Robertson is only reliable in the range of 1666.7+, though I imagine you could still go as low as 1500 and be good enough. Some are only good for roughly 1000 - 25000. Some don't really expose the DUV, which I'd really like to use since it helps rule out colors that aren't close enough to the locus to be considered accurate. Some require a more complex approximation to get the inverse result. I'm not sure we need the inverse, since we already have a way to generate the locus using Planck, but it would be nice to have it reversible. All methods probably suffer from varying degrees of error, so selecting one that is good enough is probably fine.

I've seen the Tanner Helland approach that uses a spline as an easy way to test RGB colors, but you don't get the DUV, and it isn't really trying to match a color on the isotherms; it just bisects temperatures on the spline to see which one the color is closest to, which can give some non-ideal results. It's probably not a bad way to generate the black body curve, but we can do that fine using Planck.

Robertson might be a fine way to start. We also don't need to implement every possible approach under the sun. We aren't trying to compete with things like Colour Science.

One possible approach, if not now, maybe later, is to use the method in Yoshi Ohno's "Practical Use and Calculation of CCT and Duv". The required table could be cached, and it has a range from 1000+ being most accurate around 1000 - 20000. It seems to employ two different approaches, a triangular solver for values close to the locus, and then a parabolic solver further out.
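
For reference, the triangular step from the paper looks roughly like the sketch below. This is only an illustration, assuming a precomputed table of (T, u, v) locus points with i indexing the entry nearest to the test chromaticity; the parabolic variant instead fits a parabola through the three nearest distances and takes its vertex.

import math

def ohno_triangular(u, v, table, i):
    """Ohno 2013 triangular solution (sketch only).

    `table` is a list of (T, u, v) points along the Planckian locus and
    `i` is the index of the entry nearest to the test chromaticity (u, v).
    """
    t0, u0, v0 = table[i - 1]
    t1, u1, v1 = table[i + 1]

    # Distances from the test point to the two neighboring locus points,
    # and the length of the chord between those locus points.
    d0 = math.hypot(u - u0, v - v0)
    d1 = math.hypot(u - u1, v - v1)
    length = math.hypot(u1 - u0, v1 - v0)

    # Project the test point onto the chord and interpolate the temperature.
    x = (d0 ** 2 - d1 ** 2 + length ** 2) / (2 * length)
    t = t0 + (t1 - t0) * (x / length)

    # Duv is the leftover distance, signed by whether the test point sits
    # above or below the locus.
    vt = v0 + (v1 - v0) * (x / length)
    duv = math.copysign(math.sqrt(max(d0 ** 2 - x ** 2, 0.0)), v - vt)
    return t, duv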

I'm not sure about the speed implications of any of these approaches yet, so we'll have to see about practical use cases as well. It may also be that practical applications don't require a DUV; if that is the case, we could consider alternatives we haven't already considered.

facelessuser commented 1 year ago

I did get the Ohno approach working. It allows us to get all the way down to 1000 forward and backward. We can also utilize DUV. I haven't tested the speed and such, but I have enough to start evaluating.

facelessuser commented 1 year ago

Keep in mind the following statements are for correlating a uv pair to a temperature, not a temperature to a uv pair.

As expected, the Ohno method is slower but provides a good range and decent accuracy (depending on the number of data points). The Robertson method is much faster, but it can have larger errors in some ranges and is not great below ~1666.7.

The Ohno paper suggests using a table of 303 data points for reasonable accuracy, but you have to store all those data points in memory, and for better accuracy you'd have to store a lot more. We are currently doing a more dynamic approach, which may slow things down a bit: we take X samples across the entire range (1000 - 10000), which is essentially a low-resolution test. We then find the sample with the lowest distance, take its closest neighbors as the new range, and take another set of samples, essentially a higher-resolution test in the new region (sketched below). We can usually get good results in about 5 iterations.

Assuming 10 samples for each iteration, we only test about 50 samples instead of a full 303 each time, but we are currently recalculating each sample as opposed to using a pre-calculated list. We could definitely use caching to speed this up. I'm not sure we'd want to store the full set of data needed for very accurate results, though. It also depends on the application; being slower may not be a big deal depending on how it is being used.
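
A rough sketch of that refinement loop (locus_uv here is any callable that returns the locus uv pair for a temperature; none of these names are a final API):

import math

def approx_cct(u, v, locus_uv, low, high, samples=10, iterations=5):
    """Iteratively narrow in on the temperature whose locus point is nearest (u, v).

    `locus_uv` maps a temperature in Kelvin to the (u, v) chromaticity of
    the Planckian locus at that temperature.
    """
    for _ in range(iterations):
        step = (high - low) / (samples - 1)
        temps = [low + step * i for i in range(samples)]
        dists = [math.hypot(u - ut, v - vt) for ut, vt in map(locus_uv, temps)]

        # Use the best sample's closest neighbors as the new, narrower range.
        i = dists.index(min(dists))
        low = temps[max(i - 1, 0)]
        high = temps[min(i + 1, samples - 1)]
    return (low + high) / 2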

facelessuser commented 1 year ago

I've been evaluating the Ohno method further.

In order to get better speed without storing a tremendous amount of data, we can create a spline based on a more reasonable amount of data. We can use it to estimate our initial 3 points for the triangle method and then calculate more accurate values before actually applying the triangle method. This gives us essentially the same accuracy at a speed that is orders of magnitude better than calculating every sample on the fly (a sketch follows below).

I think this is a good compromise. It is not quite as fast as having a fully pre-generated table at our disposal, but it is more memory efficient with the same accuracy. We could additionally cache queries to reduce time in some cases, though you would only see significant improvements if you happen to be testing multiple colors in the same region. We will likely not cache anything initially.
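
Conceptually, the lookup then becomes something like the following sketch, where spline_uv is the cheap approximation fit over a modest table, exact_uv computes a locus point directly, and ohno_triangular is the triangular solution sketched earlier (all names are illustrative):

import math

def cct_via_spline(u, v, spline_uv, exact_uv, temps):
    """Seed the triangular step from a cheap spline, then refine with exact values."""
    # Cheap pass: find the candidate temperature whose spline estimate is closest.
    dists = [math.hypot(u - su, v - sv) for su, sv in map(spline_uv, temps)]
    i = min(range(1, len(temps) - 1), key=lambda k: dists[k])

    # Recompute only the three bracketing points exactly, then hand them to
    # the triangular solution shown earlier.
    table = [(t, *exact_uv(t)) for t in temps[i - 1:i + 2]]
    return ohno_triangular(u, v, table, 1)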

facelessuser commented 1 year ago

I think we have enough that we just need to settle on an API and decide if/how we want to expose normalizing the color returned from temp -> color conversion into the visible spectrum.

It seems a decent way to normalize is to do so in a linear space and then transform back into the gamma corrected RGB space. The following seems like a suitable way to normalize the color:

def constrain_rgb(color):
    """Normalize a linear RGB color so that all channels land within [0, 1]."""
    coords = color.coords()
    # Amount of white needed
    w = -min(0, *coords)

    # Add enough white to make RGB all positive.
    if w > 0:
        coords = [c + w for c in coords]

    # Normalize all channels by the largest
    m = max(coords)
    color[:-1] = [c / m for c in coords] if m else coords
    return color
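
For example, a locus color could be normalized roughly like this (purely illustrative values, not the final API):

from coloraide import Color

# A hypothetical locus color as XYZ with Y = 1, normalized in linear sRGB
# and then encoded back to gamma-corrected sRGB.
xyz = Color('xyz-d65', [1.1, 1.0, 0.35])
srgb = constrain_rgb(xyz.convert('srgb-linear')).convert('srgb')
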
facelessuser commented 1 year ago

I think our API will consist of a classmethod called blackbody that allows you to generate colors along the black body curve. Colors will be normalized to be within some specified linear RGB space (sRGB Linear by default). After normalization, the colors will be in the visible spectrum, but they are a visual approximation that may not closely match the CCT. If the normalization space is set to None, the colors may not be in the visible spectrum, but they should correlate well with the CCT. ∆uv can also be specified to get a color some distance away from the locus along the related isotherm. The accuracy of these colors may drop significantly if the temperature and/or ∆uv are past the practical limits of the selected algorithm.

Also, we will have an instance method called cct that returns both the correlated color temperature and the ∆uv for a given color. If the color is too far from the locus, the values may be inaccurate.
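
Roughly, usage would look something like this (parameter names, defaults, and the exact return shape are all still tentative):

from coloraide import Color

# Tentative sketch of the proposed API, not final.
c1 = Color.blackbody(5000)              # color near the locus at 5000K
c2 = Color.blackbody(5000, duv=0.02)    # nudged off the locus along the isotherm
temp, duv = Color('orange').cct()       # estimated CCT and ∆uv for a color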

Right now we have two algorithms that we will implement: Ohno 2013, which gives us decent accuracy from 1000 - 100000, and the widely used Robertson 1968. There are quite a few approaches out there, but we'll likely stick to these two. Robertson is a bit faster than Ohno, nearly but not quite as accurate (based on the specified table), and limited to a lower bound of about 1667. Ohno is a bit slower (though we've got it running pretty fast) and allows a range from 1000 - 100000 (we could expand this range if we wanted, but this is a very practical range).

I think we will run with the larger range (Ohno) by default. As shown below, we can properly generate both the black body diagram and isotherms.

(Image: generated black body curve with isotherms.)