rmaia / pavo

tools for the analysis of color data in R
http://pavo.colrverse.com
GNU General Public License v2.0

Qcatch (or colorspace location) ---> reflectance #109

thomased commented 6 years ago

Don't know how plausible it is, but it'd be cool to be able to go from locations in colorspace to RGB (or whatever's appropriate for human printing/screens), for designing behavioural stimuli. E.g. If I want to present an animal with some colourful things 1, 2, 4, and 6 JNDs from some background to test discrimination, what should I print?

rmaia commented 6 years ago

Maybe the method in "Modeling the object-color solid and the number of discriminable colors" would be a starting point? https://elifesciences.org/articles/15675

Maybe a simpler solution could be based on the Pike chromaticity diagram with preserved distances. In that transformed space, you'd just need to pick a point on a sphere centred on the background, with a radius of x JNDs. Not sure how you'd convert those coordinates back into RGB, but it should be possible!
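
For what it's worth, a minimal sketch of that sampling step, assuming you already have the background's location in a distance-preserving, JND-scaled 3D space (e.g. the coordinates returned by pavo's jnd2xyz()). The helper name and the background coordinates below are made up for illustration:

# sample n points that all lie exactly d JNDs from a background point bg,
# given coordinates in a perceptually-scaled (JND) space
sample_at_jnd <- function(bg, d, n = 10) {
  dirs <- matrix(rnorm(n * length(bg)), nrow = n)   # random directions
  dirs <- dirs / sqrt(rowSums(dirs^2))              # normalise to unit length
  sweep(dirs * d, 2, bg, "+")                       # scale to radius d, recentre on bg
}

bg   <- c(0.1, -0.2, 0.05)               # hypothetical background location, in JND units
stim <- sample_at_jnd(bg, d = 2, n = 5)  # five candidate stimuli, all 2 JNDs away
sqrt(rowSums(sweep(stim, 2, bg)^2))      # check: every distance is exactly 2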

thomased commented 6 years ago

Yeah, interesting. The pikespace idea sounds like a great place to start — that'd simplify things. I can just about imagine a solution that goes pikespace -> cie -> rgb. I think...

Bisaloo commented 5 years ago

Okay, so I've been thinking about it since I stumbled upon Pike's paper again a couple of days ago.

I think this problem does not have a solution. I hope I'll manage to explain clearly in writing.

pavo already provides a way to convert spectra to a colourspace location, given a visual system. Let's call this function v. As we know, v is definitely not injective. For example, hummingbirds and parrots both produce green hues (in human vision) but the underlying spectra are very different (a single peak for hummingbirds vs. yellow + blue for parrots).

In other words, there exist spec1 ≠ spec2 such that v(spec1) = v(spec2).

It's possible to mathematically derive a spectrum from a given arbitrary colourspace location. I.e., if I'm provided with a visual system and coordinates p in the colourspace, I know of a function b which outputs one solution b(p) = spec_sol so that v(spec_sol) = p.

But there actually exist many solutions (infinitely many, in fact) to this equation because, as mentioned above, v is not injective.
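
For concreteness, a quick numerical illustration of the non-injectivity (just a sketch: the catch calculation is simplified to a plain inner product with the sensitivities, ignoring illuminant and normalisation, and the Gaussian "reflectance" is made up). Any vector in the null space of the sensitivities can be added to a spectrum without changing its catches:

library(pavo)

sens <- sensdata(visual = "bluetit")
S    <- as.matrix(sens[, names(sens) != "wl"])   # receptor sensitivities only
wl   <- sens$wl

spec1 <- exp(-((wl - 550) / 40)^2)               # a smooth reflectance-like curve

# projector onto the null space of t(S): anything it returns is invisible to v
P     <- diag(nrow(S)) - S %*% MASS::ginv(S)
spec2 <- spec1 + drop(P %*% rnorm(nrow(S)))

all.equal(drop(t(S) %*% spec1), drop(t(S) %*% spec2))  # TRUE: same catches, different spectra

Note that spec2 will typically be jagged and partly negative, i.e. not a physically sensible reflectance, which is part of the problem discussed below.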

But maybe that's not an issue? Let's see how it goes if we run with this idea. Say we want to do something similar to Pike's paper: print butterfly wings whose colour has been chosen to map to an arbitrary location in an avian visual system.

We know how to do

p --(b)--> spec_sol --(v)--> RGB
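
For that last arrow, v would have to be applied with a human observer, so a simplified version of spec_sol --> RGB is spectrum -> CIE XYZ via the cie10 matching functions, then the standard XYZ -> sRGB matrix. A rough sketch only, ignoring illuminant, white-point normalisation and proper gamut mapping (the Gaussian spectrum stands in for whatever b() returns):

library(pavo)

cmf <- sensdata(visual = "cie10")
A   <- as.matrix(cmf[, names(cmf) != "wl"])      # the x, y, z matching functions
wl  <- cmf$wl

spec_sol <- exp(-((wl - 560) / 50)^2)            # placeholder for the output of b()

XYZ <- drop(t(A) %*% spec_sol)
XYZ <- XYZ / max(XYZ)                            # crude scaling, for illustration only

# standard linear XYZ -> linear sRGB matrix (IEC 61966-2-1)
M <- matrix(c( 3.2406, -1.5372, -0.4986,
              -0.9689,  1.8758,  0.0415,
               0.0557, -0.2040,  1.0570), nrow = 3, byrow = TRUE)
rgb_lin <- pmin(pmax(M %*% XYZ, 0), 1)           # clamp out-of-gamut values

# sRGB gamma companding
srgb <- ifelse(rgb_lin <= 0.0031308,
               12.92 * rgb_lin,
               1.055 * rgb_lin^(1 / 2.4) - 0.055)
round(srgb, 3)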

But then, this RGB stimulus has to be presented to the experimental subject, either by printing it or by displaying it on a screen. Even though we encoded it as RGB, what reaches the subject's eye will actually be a spectrum, because that's what all light is.

So what's actually happening is

p --(b)--> spec_sol --(v)--> RGB --(printer)--> spec_print

And we have no idea whether spec_print is the same as spec_sol. In other words, we can't be sure that spec_print, once fed to v, will output p, which was the initial goal.

In theory, there might be ways to overcome this issue, but I'm not sure how realistic they are. For example, maybe there exists a printer or screen that can display a given spectrum (instead of a given RGB stimulus).

So those are my thoughts for now; maybe we'll think of something else at some point.

thomased commented 4 years ago

Yeah, agreed. Thinking this over again, I'm going to move the goalposts to just estimating a reflectance spectrum from qcatches (or colourspace location, same difference): qcatch --> reflectance, so b in your notation above. Yes, there are probably a ton of solutions (maybe infinitely many in a technical sense), but natural spectra are fairly smooth and predictable, so you should be able to recover a strong guess. Some kind of polynomial fit to your qcatch data would get you close, I'd think.

Apart from just being cool, it could still be useful for things like stimulus design in a rough-and-ready way. It might be of use to camera-based people too, since a calibrated setup can return qcatches for any pixel, which you could then estimate reflectances from, thereby turning any camera into a cheap hyperspec of sorts. I'm pretty sure Jolyon mentioned he'd successfully toyed with the idea; it was just a bit of a pain to implement in ImageJ/Java.
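
A rough sketch of that polynomial idea (everything here is illustrative: catches are again just an inner product with the sensitivities, and the basis choice is arbitrary). With one polynomial term per receptor the linear system is square, so there is a unique smooth spectrum reproducing the target catches, although nothing yet stops it going negative:

library(pavo)

sens <- sensdata(visual = "bluetit")
S    <- as.matrix(sens[, names(sens) != "wl"])
wl   <- sens$wl

# pretend these catches are the only data we have
true_spec <- exp(-((wl - 550) / 40)^2)
q         <- as.numeric(t(S) %*% true_spec)

# represent the estimate on a low-order polynomial basis: one term per receptor,
# so t(S) %*% B is square and the system has a unique solution
B        <- cbind(1, poly(wl, degree = ncol(S) - 1))
coefs    <- solve(t(S) %*% B, q)
spec_hat <- drop(B %*% coefs)

all.equal(as.numeric(t(S) %*% spec_hat), q)   # TRUE: the catches are reproduced exactly
range(spec_hat)                               # smooth, but nothing forbids negative values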

Bisaloo commented 4 years ago

Here's what I did a couple of years ago: I was trying to design a spectrum (bh_spec2) that had the same quantum catches / colourspace location as a known spectrum (bh_spec1):

library(pavo)
#> Welcome to pavo 2! Take a look at the latest features (and update your bibliography) in our recent publication: Maia R., Gruson H., Endler J. A., White T. E. (2019) pavo 2: new tools for the spectral and spatial analysis of colour in R. Methods in Ecology and Evolution, 10, 1097-1107.

# the in-built CIE 10-degree observer data as a matrix, plus its Moore-Penrose pseudoinverse
ciexyz <- as.matrix(sensdata("cie10"))
inved  <- MASS::ginv(ciexyz)

wl <- 300:700

# a smooth, two-peaked "target" spectrum
bh_spec1 <- 120 * exp(-0.1 * (wl - 500)) / (1 + exp(-0.1 * (wl - 500)))^2 +
  120 * exp(-0.1 * (wl - 600)) / (1 + exp(-0.1 * (wl - 600)))^2

# project the target through the sensitivities and back via the pseudoinverse
bh_spec2 <- t(bh_spec1 %*% ciexyz %*% inved)

bh_spec <- data.frame(wl = wl,
                      spec1 = bh_spec1,
                      spec2 = bh_spec2)
bh_spec <- as.rspec(bh_spec, lim = c(300, 700))
#> wavelengths found in column 1

plot(bh_spec)


plot(colspace(vismodel(bh_spec, "cie10")))
#> Warning: cie system chosen, overriding incompatible parameters.

Created on 2020-01-21 by the reprex package (v0.3.0)

It works quite well in this case, but the issue is that I have no way to specify that I want a smooth, non-negative function, so it doesn't always work:

library(pavo)
#> Welcome to pavo 2! Take a look at the latest features (and update your bibliography) in our recent publication: Maia R., Gruson H., Endler J. A., White T. E. (2019) pavo 2: new tools for the spectral and spatial analysis of colour in R. Methods in Ecology and Evolution, 10, 1097-1107.

# the in-built blue tit visual system data as a matrix, plus its pseudoinverse
bluetit       <- as.matrix(sensdata("bluetit"))
bluetit_inved <- MASS::ginv(bluetit)

wl <- 300:700

# a smooth, single-peaked "target" spectrum
hb_spec1 <- 120 * exp(-0.1 * (wl - 510)) / (1 + exp(-0.1 * (wl - 510)))^2

# same projection trick, but this time the reconstruction goes negative
hb_spec2 <- t(hb_spec1 %*% bluetit %*% bluetit_inved)

hb_spec <- data.frame(wl = wl,
                      spec1 = hb_spec1,
                      spec2 = hb_spec2)

plot(as.rspec(hb_spec))
#> wavelengths found in column 1
#> The spectral data contain 92 negative value(s), which may produce unexpected results if used in models. Consider using procspec() to correct them.

Created on 2020-01-21 by the reprex package (v0.3.0)
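
One rough way to bolt both constraints on (purely a sketch: catches are the same simplified inner product as above, the spline basis and df are arbitrary, and base R's optim() stands in for a proper constrained solver) is to build the spectrum from non-negative B-splines and force the coefficients to be non-negative too, so the estimate is smooth and can never dip below zero:

library(pavo)
library(splines)   # bs(): B-spline basis; every basis function is non-negative

sens <- sensdata(visual = "bluetit")
S    <- as.matrix(sens[, names(sens) != "wl"])
wl   <- sens$wl

# target catches, again taken from a known smooth spectrum
q <- as.numeric(t(S) %*% (120 * exp(-0.1 * (wl - 510)) / (1 + exp(-0.1 * (wl - 510)))^2))

# spectrum = B %*% cf with cf >= 0, hence smooth and non-negative everywhere
B   <- bs(wl, df = 6)
obj <- function(cf) sum((as.numeric(t(S) %*% (B %*% cf)) - q)^2)
fit <- optim(rep(1, ncol(B)), obj, method = "L-BFGS-B", lower = 0)

spec_hat <- drop(B %*% fit$par)
range(spec_hat)                      # >= 0 by construction
as.numeric(t(S) %*% spec_hat) - q    # catches matched approximately rather than exactly

The trade-off is that the catches are now only matched as closely as the constrained basis allows.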

Bisaloo commented 4 years ago

https://cran.r-project.org/web/packages/colorSpec/vignettes/inversion.pdf

Seems like this problem has been cracked. I need to have a look at the refs whenever I get a moment.

thomased commented 4 years ago

Oh cool, will do the same. That looks great.