holoviz-topics / imagen

ImaGen: Generic Python library for 0D, 1D and 2D pattern distributions
https://imagen.holoviz.org/
BSD 3-Clause "New" or "Revised" License

Odd colour display of images when scale>1.0 #38

Open · jlstevens opened this issue 9 years ago

jlstevens commented 9 years ago

One issue with the recent addition of colour display for ImaGen pattern generators is that odd colour artifacts sometimes appear. I have traced this to scale values greater than 1.0. For instance:

from imagen.image import FileImage

FileImage(aspect_ratio=1.0, scale=2.0,
          filename='./images/mcgill/foliage_a_combined/01.png')[:]

The issue is that the RGBA SheetViews expect values between 0.0 and 1.0. When the scale is increased, eventually some channel will exceed 1.0 and be clipped back into this range.

Although the fix is easy for GenericImage, it isn't so clear what to do for other classes that use multiple channels. What is the best approach for ComposeChannels and other composite types, where the maximum scale value is a function of multiple pattern generators/inputs?
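
For illustration, here is a minimal numpy sketch (not ImaGen's actual display path) of the problem: once a pattern in the native 0.0-1.0 range is scaled by 2.0, some channel values land outside the range that the RGBA SheetViews expect.

import numpy as np

# Stand-in for an RGB pattern array in ImaGen's native 0.0-1.0 range
# (hypothetical data, purely for illustration).
pattern = np.random.rand(4, 4, 3)
scaled = pattern * 2.0            # scale=2.0, as in the FileImage example

# Some channels now exceed 1.0, so anything that assumes the 0.0-1.0
# range (such as the RGBA SheetViews) has to clip them somewhere.
print((scaled > 1.0).any())       # True (almost surely)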

jbednar commented 9 years ago

This is a tricky issue. In terms of our simulations, I don't think that having RGB or LMS values above 1.0 is actually a problem; I don't think there's any clipping or overflow that would happen inside our actual models (I hope!).

Right now it must be getting clipped eventually by matplotlib for display, and I'm guessing that it's doing such clipping in RGB, not HSV, leading to the observed artifacts. For example, take an orange colour like (R=255, G=128, B=0): scaling it up by 2.0 gives (R=510, G=256, B=0), which turns into bright yellow (R=255, G=255, B=0) when each channel is clipped to one byte.
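
To make that concrete, here is a small numpy sketch (illustrative only, not the actual matplotlib code path) of the hue shift:

import numpy as np

orange = np.array([255.0, 128.0, 0.0])    # (R, G, B) orange
scaled = orange * 2.0                     # (510, 256, 0)

# Clipping each channel independently in RGB space changes the R:G
# ratio, so the hue shifts from orange towards bright yellow.
clipped = np.clip(scaled, 0.0, 255.0)
print(clipped)                            # [255. 255.   0.]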

If it's just about the display, then I think we have several options:

  1. We could just leave it, because it tells us very clearly when clipping is happening, which we might want to know.
  2. We could clip any out-of-range pixels in HSV space, which just means normalizing each such pixel by its highest channel value: (R=510, G=256, B=0) * (255/510) = (R=255, G=128, B=0). Areas with clipping should then look washed out (like an overexposed photo), but otherwise normal (see the sketch after this list).
  3. We could normalize the entire image for display purposes -- if any pixel is out of range, renormalize so that the brightest pixel becomes 255 (1.0 in our native format). Pictures will then look normal, but the scale will be misleading, because the image will appear dimmer than it really was (compared to other images that do not exceed the threshold).
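
Roughly, options 2 and 3 could look like the sketch below (the function names are hypothetical, not existing ImaGen API, and the arrays are assumed to use the native 0.0-1.0 range):

import numpy as np

def normalize_per_pixel(rgb):
    # Option 2 (sketch): scale each out-of-range pixel down by its
    # brightest channel, preserving hue but washing out clipped areas.
    peak = rgb.max(axis=-1, keepdims=True)
    return np.where(peak > 1.0, rgb / peak, rgb)

def normalize_whole_image(rgb):
    # Option 3 (sketch): if any pixel is out of range, rescale the
    # whole image so its brightest value becomes 1.0; the apparent
    # brightness is then no longer comparable across images.
    peak = rgb.max()
    return rgb / peak if peak > 1.0 else rgb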

If all this is correct, then I probably favor either option 1 or 2, with 2 probably better (in the sense of not alarming people when there isn't really a problem) but 1 being much less work.