FWIW, I ran into this as well, and my take on it was that it's caused by the Keras MaxPooling layer (with a 2x2 pool) ignoring the last, singleton row/column in images, which means that any time the height or width isn't even somewhere along the downsampling path in the UNet, a pixel is lost. That then makes the upsampled features incompatible at whatever level had an odd dimension going into pooling, since (2x2) upsampled features always have even height/width dimensions.
Since the unet4nuclei network has 3 MaxPooling/UpSampling layers, I've been working around this by resizing my images so that both dimensions are evenly divisible by 2 at least 3 times (i.e. by 8). Maybe there's a more elegant way to do that with single-pixel cropping/zero-padding in the network, though ¯\_(ツ)_/¯
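To make the shape arithmetic concrete, here's a minimal sketch (not the plugin's actual code; trace_unet_shapes is just a hypothetical helper) tracing a 520x695 image through three 2x2 pooling and upsampling stages:

def trace_unet_shapes(height, width, n_levels=3):
    """Trace feature-map HW sizes through n_levels of 2x2 pooling, then upsampling."""
    shapes = [(height, width)]
    for _ in range(n_levels):  # encoder: each 2x2 pool floors odd sizes
        height, width = height // 2, width // 2
        shapes.append((height, width))
    for _ in range(n_levels):  # decoder: each 2x2 upsample doubles sizes
        height, width = height * 2, width * 2
        shapes.append((height, width))
    return shapes

trace_unet_shapes(520, 695)
# [(520, 695), (260, 347), (130, 173), (65, 86), (130, 172), (260, 344), (520, 688)]
# 688 != 695, so the upsampled features no longer match the skip connection at full resolution.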
Thanks for pointing this out and explaining the error, Eric! I used a similar workaround for my images but did not have the time to integrate it into the plugin. Do you have some sample code at hand?
I was using the network outside of the context of CellProfiler but was working with logic like this if it helps:
from skimage import transform
import numpy as np

def unet_shape_resize(shape, n_pooling_layers):
    """Resize shape for compatibility with UNet architecture

    Args:
        shape: Shape of images to be resized in format HW[D1, D2, ...] where any
            trailing dimensions after the first two are ignored
        n_pooling_layers: Number of pooling (or upsampling) layers in network
    Returns:
        Shape with HW sizes transformed to nearest value acceptable by network
    """
    base = 2 ** n_pooling_layers
    rcsh = np.round(np.array(shape[:2]) / base).astype(int)
    # Combine HW axes transformation with trailing shape dimensions
    # (being careful not to return 0-length axes)
    return tuple(base * np.clip(rcsh, 1, None)) + tuple(shape[2:])

def unet_image_resize(image, n_pooling_layers):
    """Resize image for compatibility with UNet architecture

    Args:
        image: Image to be resized in format HW[D1, D2, ...] where any
            trailing dimensions after the first two are ignored
        n_pooling_layers: Number of pooling (or upsampling) layers in network
    Returns:
        Image with HW dimensions resized to nearest value acceptable by network
    """
    shape = unet_shape_resize(image.shape, n_pooling_layers)
    # Note here that the type and range of the image will either not change
    # or become float64, 0-1 (which makes no difference w/ subsequent min/max scaling)
    return image if shape == image.shape else transform.resize(
        image, shape, mode='reflect', anti_aliasing=True)
unet_image_resize(np.ones((520,695)), 3).shape # --> (520, 696)
unet_image_resize(np.ones((2,35)), 3).shape # --> (8, 32)
unet_image_resize(np.ones((520,695,3,4,5)), 3).shape # --> (520, 696, 3, 4, 5)
Looking at the code though, it seems like the shape resize could apply pretty seamlessly in unet_initialize and the image resize in unet_classify.
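As a rough sketch of that wiring, assuming unet_classify takes a model and an image and returns an HW[C] prediction (its actual signature in the plugin may differ), the prediction could be mapped back onto the original image grid afterwards:

def classify_with_resize(model, image, n_pooling_layers=3):
    # Hypothetical glue code; the real unet_classify in the plugin may differ.
    original_hw = image.shape[:2]
    resized = unet_image_resize(image, n_pooling_layers)  # make HW divisible by 2**n_pooling_layers
    prediction = unet_classify(model, resized)            # assumed to return an HW[C] array
    # Resize the prediction back to the original image shape
    return transform.resize(prediction, original_hw + prediction.shape[2:],
                            mode='reflect', anti_aliasing=True)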
Thanks for sharing the code, Eric! I agree the functions fit in quite well. I'll try to integrate them.
@eric-czech I made a little stand-alone python package for this which incorporates your changes suggested in this issue. Hope that is ok with you. (see https://github.com/VolkerH/unet-nuclei)
Certainly! Thanks for putting up that repo @VolkerH -- looks very helpful for future versions of ourselves wanting to give this model a spin.
This is probably not the place for this discussion but maybe @cells2numbers will have some advice on how to integrate CP modules/plugins into external pipelines without all the GUI-related dependencies. I find myself often lifting code out of this repo or the main CP repo simply because my efforts to use the modules directly from code (ideally in a Python 3 environment) haven't gone well. Is there a better way to go about that? Or more specifically, are these two items on the roadmap likely to drop any time soon?
Does that commit allow us to close this one (and #70)?
At the time I created the pull request it did fix both issues #65 and #70. I haven't tested it for a while and don't have a good setup for testing it at the moment.
Note, however, that the keras version is not pinned in the requirements.txt and requirements-windows.txt files, so installing dependencies via pip install -r requirements.txt may at some point pull an incompatible version of keras again. My pull request should work with keras 2.2.2. Might make sense to pin it down to >= 2.2.
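For example, the keras line in requirements.txt could read something like the following (the bound just mirrors the >= 2.2 suggestion above, not what the files currently contain):

keras>=2.2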
I am closing this issue because it refers to a plugin that we are not currently maintaining.
ClassifyPixels-Unet fails to process an image of size 695x520. The same image with a size of 696x520 works fine.
Error message:
Attached are two images to reproduce the error (one fails, one works), plus the pipeline used:
unet_test_data.zip