napari / napari

napari: a fast, interactive, multi-dimensional image viewer for python
https://napari.org
BSD 3-Clause "New" or "Revised" License

add built-in colormaps with transparency #3575

Open chili-chiu opened 3 years ago

chili-chiu commented 3 years ago

@cgohlke

🚀 Feature

Napari offers a "translucent" volume rendering mode, but there are no longer any built-in colormaps with transparency. Having some built-in colormaps with transparency would be very helpful when working with that rendering mode.

Alternatives

Adding an interactive transfer function editor so users can adjust transparency settings would be even more useful.

alisterburt commented 3 years ago

Love the idea - do you have any experience with what these colormaps tend to look like in other software? I've thought about this before, particularly when rendering volumes with dark backgrounds on our white canvas in the light theme, but wasn't immediately sure how to tackle it.

chili-chiu commented 3 years ago

@alisterburt The current behavior when 2 or more image layers have transparency settings is that the colors mix.

Red image layer + blue image layer of binary blobs, each with alpha = 0.5 for blobs and alpha = 0 for background: the overlap shows up as the mixed color (purple).

The more desired behavior - at least for fluorescence microscopy images - is to keep each layer's color scheme and display the color of the closest (to the viewer) pixel that has an alpha value > 0.

This behavior has the advantage of conveying signals' relative positions and maintaining the ability to distinguish each channel when many colors may be involved.
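For reference, a rough sketch of how this setup could be reproduced in a script (a reconstruction for illustration only, not the exact code behind the screenshots; the colormap construction and blob parameters are assumptions):

import numpy as np
import napari
from napari.utils import Colormap
from skimage import data

# two independent 3D binary-blob volumes
blobs_red = data.binary_blobs(length=128, n_dim=3, volume_fraction=0.2).astype(float)
blobs_blue = data.binary_blobs(length=128, n_dim=3, volume_fraction=0.2).astype(float)

def half_alpha_colormap(rgb, name):
    # background (value 0) fully transparent, blobs drawn at alpha = 0.5
    colors = np.array([[0.0, 0.0, 0.0, 0.0], [*rgb, 0.5]])
    return Colormap(colors=colors, name=name)

viewer = napari.Viewer(ndisplay=3)
viewer.add_image(blobs_red, colormap=half_alpha_colormap((1, 0, 0), 'red_half_alpha'),
                 blending='translucent')
viewer.add_image(blobs_blue, colormap=half_alpha_colormap((0, 0, 1), 'blue_half_alpha'),
                 blending='translucent')
napari.run()  # overlapping blobs blend to purple with the default translucent blending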

kevinyamauchi commented 3 years ago

Thanks for making this issue, @chili-chiu, and for the examples! From my understanding, the behavior you would like to add is:

- background pixels (values at or below some threshold) are rendered as fully transparent
- where layers overlap, the displayed color comes from the layer closest to the viewer, rather than a blend of all overlapping layers

Did I understand correctly?

For the transparent background, one option is to add a shader filter that "discards" (i.e., does not render) pixels below a threshold value. In this way, the "transparent background" behavior can be added to any colormap without having to create new colormaps. I made a prototype in this plugin. I am sorry, but it might be broken, as I made it as an experiment some time ago and haven't really revisited it. If you are interested, I could try to fix it and show it at the next Americas/Euro/Africa community meeting.
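To illustrate the idea, here is a minimal sketch using vispy's fragment-shader filter hooks (this is not the plugin's actual code; the class name and threshold default are made up, and for simplicity the test is on the fragment's output alpha rather than the raw data value):

from vispy.visuals.filters import Filter
from vispy.visuals.shaders import Function

class DiscardBelowThreshold(Filter):
    # Fragment-shader hook that drops fragments whose alpha falls below a
    # threshold, so "background" pixels are simply not drawn at all.
    FRAG_SHADER = """
        void discard_low_alpha() {
            if (gl_FragColor.a < $threshold) {
                discard;
            }
        }
    """

    def __init__(self, threshold=0.05):
        super().__init__(fcode=Function(self.FRAG_SHADER), fpos=10)
        self.fshader['threshold'] = float(threshold)

# Attaching it to a vispy visual (for a napari layer this means reaching into
# the private vispy node that backs the layer):
# visual.attach(DiscardBelowThreshold(threshold=0.05))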

As a related aside, I think it would be nice to expose a way to easily attach/detach vispy visual filters. See the discussion in this twitter thread: https://twitter.com/martweig/status/1460643240588238849

For the rendering, this is a bit trickier, as we need to be able to figure out what is "in front". For isosurface rendering, I think this is clear, as we know the position of the surface being rendered. I could imagine taking a similar approach for max/min intensity projections (use the position where the max/min was found). I am less certain what to do for average projections. Adding "depth" to our 3D image rendering is something @alisterburt and I have been wanting to do, and we would be happy to discuss if you have some ideas or comments!

alisterburt commented 3 years ago

@kevinyamauchi this issue came about after @chili-chiu showed me the 'blending' mode in Imaris. In this mode, the user plays with a widget which defines a mapping between intensity value and alpha value. This is similar to your plugin, but rather than one threshold you define...

We ran a quick experiment to replicate this functionality, and @chili-chiu noted that when we set the alpha value to 0.5 in red and blue channels of the same intensity, we end up with the purple you see in the overlaps above.

I think the question is: if we have two channels with different alpha values and different z heights at the same pixel, can we choose to render only that which is closest to the canvas and have it appear with the correct RGBA for the channel which is closest? i.e. maintain the colour and alpha, rather than blend with the colour and alpha from the other layer

I think this might be what happens with our opaque blending mode? really not sure though, would need to run some tests

kevinyamauchi commented 3 years ago

Thanks for the clarification, @alisterburt .

> I think the question is: if we have two channels with different alpha values and different z heights at the same pixel, can we choose to render only that which is closest to the canvas and have it appear with the correct RGBA for the channel which is closest? i.e. maintain the colour and alpha, rather than blend with the colour and alpha from the other layer

I believe this is what would happen if we properly set the depth buffer for each image (I think for both the translucent and opaque blending modes). For each pixel, only the image closest to the screen would get rendered. However, the depth buffer is set by all of the visuals that have depth turned on, so there could potentially be other visuals "fighting" over it. In #3398, I introduced a translucent blending mode that doesn't do the depth check, so the user could set any layer they don't want to participate in depth testing to that blending mode.

In case it is helpful, here are the gl settings that the different blending modes correspond to: https://github.com/vispy/vispy/blob/99d04d20902aed96aeea13aec8e44979668112d4/vispy/gloo/wrappers.py#L37-L52
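As a small sketch of what that looks like from the user side (assuming the 'translucent_no_depth' blending mode from #3398 is available in your napari version; the data here is just random noise for illustration):

import numpy as np
import napari

rng = np.random.default_rng(0)
volume = rng.random((64, 64, 64))

viewer = napari.Viewer(ndisplay=3)

# this layer participates in the depth test
front = viewer.add_image(volume, name='depth-checked', colormap='red',
                         blending='translucent')

# this layer starts out the same...
back = viewer.add_image(volume, name='no-depth', colormap='blue',
                        blending='translucent')
# ...and is then switched to the mode from #3398, which skips the depth
# check so it cannot "fight" with other visuals over the depth buffer
back.blending = 'translucent_no_depth'

napari.run()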

Jose-Verdu-Diaz commented 2 years ago

I found a workaround by defining custom colormaps, modifying this code.

Create a linspace colormap from white (1, 1, 1, 1) to blue (0, 0, 1, 1):

import numpy as np

colors = np.linspace(
    start=[1, 1, 1, 1],
    stop=[0, 0, 1, 1],
    num=10,
    endpoint=True
)

Select the first color and set its alpha value to 0:

colors[0] = np.array([1., 1., 1., 0])

Define new colormap:

new_colormap = {
  'colors': colors,
  'name': 'white_to_blue',
  'interpolation': 'linear'
}

Add an image into napari with the custom colormap (here img is your image array):

import napari

viewer = napari.Viewer()
viewer.add_image(img, colormap=new_colormap)

The ideal result (in my particular application) is that only pixel values equal to 0 are transparent, but I'm not sure whether this workaround also makes some non-zero pixels transparent... I will play with this solution for a while; if I make it work as intended I'll update this issue.
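One way to keep the transparency confined to values very close to zero (an untested assumption on my part, relying on napari.utils.Colormap accepting explicit control points) is to finish the alpha ramp just above zero:

import numpy as np
from napari.utils import Colormap

# control points along the (contrast-limit-normalized) data range:
# 0 -> transparent white, 0.01 -> opaque white, 1 -> opaque blue
colors = np.array([
    [1.0, 1.0, 1.0, 0.0],
    [1.0, 1.0, 1.0, 1.0],
    [0.0, 0.0, 1.0, 1.0],
])
controls = [0.0, 0.01, 1.0]

sharp_white_to_blue = Colormap(colors=colors, controls=controls,
                               name='white_to_blue_sharp')

# then, e.g.: viewer.add_image(img, colormap=sharp_white_to_blue)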

alisterburt commented 2 years ago

Thank you for sharing, @Jose-Verdu-Diaz! That's really nice and I'm glad it works for your application.

Kevin has added depth buffer support to the vispy VolumeVisual, so we really aren't far away from 'composable' raycasting through multiple volumes - this just needs to be enabled and set up properly for volume rendering without interfering with the depth settings used for plane rendering.

I don't need this myself so will likely not get to it soon, but if anyone would like to work on pushing this forward I would be keen to help or pair on the problem - don't hesitate to reach out!

alisterburt commented 2 years ago

I started pushing on the depth side of this a bit in vispy/vispy#2305 - the colormap side should be easy to do in a small napari plugin :)