Open JamesCallanan opened 2 years ago
I found the issue. It is caused by resizing the `cam` variable with `skimage.transform.resize()`:

```python
from skimage.transform import resize

capi = resize(cam, (128, 128, 128))
```

For me, this resizing implementation results in `cam[:, :, 0]` equalling `cam[:, :, 1]`, and `cam[:, :, -1]` equalling `cam[:, :, -2]`. Swapping in a resizing function that relies on `scipy.ndimage.zoom()` worked for me; each slice is now unique:
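A quick way to confirm this behaviour is to count the distinct depth slices of the resized volume. This is a minimal sketch: the small random array standing in for `cam` and the helper `n_unique_slices` are my own, not from the notebook.

```python
import numpy as np
from skimage.transform import resize

def n_unique_slices(vol):
    """Number of distinct slices along the last axis of a volume."""
    return len({vol[:, :, k].tobytes() for k in range(vol.shape[-1])})

# Small random stand-in for `cam` (the real shape comes from the conv layer)
rng = np.random.default_rng(0)
cam = rng.random((4, 4, 6))

capi = resize(cam, (128, 128, 128))
print(capi.shape, n_unique_slices(capi))
```

If the count is lower than the number of output slices, edge slices have been duplicated by the resize.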
```python
from scipy import ndimage

def resize_volume(img, desired_depth, desired_height, desired_width):
    """Resize a volume across all three axes with linear interpolation."""
    current_depth = img.shape[-1]
    current_width = img.shape[0]
    current_height = img.shape[1]
    # Compute per-axis zoom factors
    depth_factor = desired_depth / current_depth
    width_factor = desired_width / current_width
    height_factor = desired_height / current_height
    # Rotate
    img = ndimage.rotate(img, 90, reshape=False)
    # Resize across z-axis
    img = ndimage.zoom(img, (width_factor, height_factor, depth_factor), order=1)
    return img

capi = resize_volume(cam, 128, 128, 128)
```
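To verify that the `scipy.ndimage.zoom()`-based resize produces distinct slices, the same uniqueness check can be run on its output. The small random stand-in for `cam` is an assumption, and the rotation step is omitted here because it does not affect slice uniqueness:

```python
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(0)
cam = rng.random((4, 4, 6))  # stand-in for the real Grad-CAM volume

# Per-axis zoom factors, as in resize_volume above
factors = (128 / cam.shape[0], 128 / cam.shape[1], 128 / cam.shape[-1])
capi = ndimage.zoom(cam, factors, order=1)

# Count distinct depth slices
unique = len({capi[:, :, k].tobytes() for k in range(capi.shape[-1])})
print(capi.shape, unique)
```

With linear interpolation over distinct input slices, every output slice gets its own interpolation weights, so all 128 slices come out distinct.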
Hello @JamesCallanan @fitushar, could you let me know how much memory this requires? I am getting an OOM (out-of-memory) error with 24 GB of GPU memory and 128 GB of RAM.
Hi,
I'm looking to apply Grad-CAM to a 3D CNN classifier that I have trained. This CNN takes input volumes of shape (250,250,6). I have applied the same approach as outlined in Grad-CAM.ipynb to this network.
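The referenced notebook is not reproduced here; for context, this is a minimal 3D Grad-CAM sketch in the spirit of the Keras example, using a toy model so it runs standalone. All layer names, shapes, and the random input are assumptions, not the actual trained classifier:

```python
import numpy as np
import tensorflow as tf
from tensorflow import keras

# Toy 3D CNN standing in for the trained classifier (names/shapes assumed)
inputs = keras.Input(shape=(32, 32, 6, 1))
x = keras.layers.Conv3D(8, 3, padding="same", activation="relu",
                        name="last_conv")(inputs)
x = keras.layers.GlobalAveragePooling3D()(x)
outputs = keras.layers.Dense(2, activation="softmax")(x)
model = keras.Model(inputs, outputs)

def grad_cam_3d(model, volume, conv_layer_name="last_conv"):
    """Grad-CAM for a 3D CNN: channel-weighted sum of the last conv maps."""
    grad_model = keras.Model(
        model.inputs,
        [model.get_layer(conv_layer_name).output, model.output])
    with tf.GradientTape() as tape:
        conv_out, preds = grad_model(volume)
        class_channel = preds[:, tf.argmax(preds[0])]
    grads = tape.gradient(class_channel, conv_out)
    # Average gradients over batch and the three spatial axes -> per-channel weights
    weights = tf.reduce_mean(grads, axis=(0, 1, 2, 3))
    cam = tf.reduce_sum(conv_out[0] * weights, axis=-1)
    cam = tf.nn.relu(cam)
    return (cam / (tf.reduce_max(cam) + 1e-8)).numpy()

volume = np.random.rand(1, 32, 32, 6, 1).astype("float32")
cam = grad_cam_3d(model, volume)
print(cam.shape)  # one heatmap value per spatial location: (32, 32, 6)
```

The resulting `cam` has the spatial shape of the chosen conv layer, which is why it then needs resizing up to the input resolution.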
A heatmap of the correct shape is returned. However, I'm finding that only 4 of those heatmap slices are unique.
i.e. `heatmap[:, :, 0] == heatmap[:, :, 1]` and `heatmap[:, :, 4] == heatmap[:, :, 5]`.
I was wondering if you have come across this before or have any idea what could be going on.
Thank you, James.