ANTsX / ANTsPy

A fast medical imaging analysis library in Python with algorithms for registration, segmentation, and more.
https://antspyx.readthedocs.io
Apache License 2.0

Smudging in ants registration #191

Closed. anorak94 closed this issue 4 years ago.

anorak94 commented 4 years ago

I have been using ants to register cerebellum slices to a reference atlas. For the most part it works correctly, but sometimes I get a weird smudging in my registered images.

[Screenshots: 2020-06-25 at 02 02 34, 02 02 14, 02 01 48]

This is my ants call:

    # rss, rssm_contour, dapi_contour, cntr_metric, weight, and immt1 are not
    # defined in this snippet (they come from earlier in the notebook)
    fixed_contour = ants.from_numpy(rssm_contour)
    moving_contour = ants.from_numpy(dapi_contour)

    # extra contour-driven metric for the deformable stage
    reg_contour = [cntr_metric, fixed_contour, moving_contour, weight, 64, "Regular", 0.85]
    metrics = [reg_contour]

    imft1 = rss  # fixed image
    # histogram-match the moving image to the fixed image
    immt2 = ants.from_numpy(match_histograms(immt1.numpy(), imft1.numpy()))
    tx = ants.registration(fixed=imft1, moving=immt2, type_of_transform="SyN",
                           metric="MattesMutualInformation",
                           reg_iterations=(500, 500, 500, 500, 500),
                           grad_step=0.25,
                           aff_iterations=(5000, 5000, 5000, 5000, 5000),
                           aff_shrink_factors=(4, 2, 2, 1, 1),
                           aff_smoothing_sigmas=(3, 2, 1, 0, 0),
                           multivariate_extras=metrics)
    mywarpedimage = ants.apply_transforms(fixed=imft1, moving=immt2,
                                          transformlist=tx['fwdtransforms'])

Is there a way to mitigate this smudging effect?

The images in which it works well:

[Screenshots: 2020-06-25 at 02 01 10, 02 07 08]
stnava commented 4 years ago

Sounds like a nice project. Do you have data that you can share, i.e. a reproducible example that illustrates this?

stnava commented 4 years ago

Also, I can't see any smudging, but that may be because I don't know what you mean by the term.

https://en.wikipedia.org/wiki/Smudging

anorak94 commented 4 years ago

Sorry, I mean that the green from the image has been smudged all over the place instead of being confined to the boundaries of the cerebellum, where it should be.

anorak94 commented 4 years ago

I will share the data.

anorak94 commented 4 years ago

Here is the data and the notebook: https://we.tl/t-88ZFVRXWGe

Also, on a related note: when the microscopy images are flipped, i.e. in the opposite left-to-right orientation to the microscopy images given here, ants doesn't flip them around. Maybe I will share that example later.

stnava commented 4 years ago

Can you provide a cleaner and tighter example? I'd have to rewrite this one to figure out what is going on, and even then it's hard to know what you mean by smudging. I need a very specific and very clear example.

anorak94 commented 4 years ago
[Screenshot: 2020-06-25 at 15 07 08]

I'll do that. I have simplified the notebook.

I am doing a multimetric registration: I am registering two images, and the registration is being guided by the contours that I have extracted from the fixed and moving images.

By smudging I mean that in some images you see a lot of green all over the image, while in the first image the registration is very bad and there is just a green streak across the image.

Also, in some cases the registration fails.

https://we.tl/t-FVZ7pDzKop

anorak94 commented 4 years ago
[Screenshots: 2020-06-25 at 17 30 07, 17 29 52]

I have this green color inside the yellow contour drawn in the image, and the registration algorithm has failed as well, since the image does not occupy the entire fixed image. I think it's because of the smearing of the green color outside the moving image boundary, which probably decreases the image dissimilarity measure artificially.

Is it clear now, or should I provide some more clarification?

stnava commented 4 years ago

It is not clear enough to me. I don't have time to unpack your code so that it could be (1) understood and (2) debugged. It's also not at all clear what your data is supposed to represent and what your issue is. That being said, I can run your example, but I don't have the two hours it would take to rewrite it. Ideally you would provide:

  1. a clear explanation of what the data is and what it's supposed to represent, plus a demonstration that it's not just garbage in

  2. a step by step script (not a function) that shows the problem

  3. explanation of what you think the issue might be

I may eventually look at the ipynb, but who knows how long that could take.

anorak94 commented 4 years ago

[Images: ref, ref_c, dapi_c, dapi]


import ants
import matplotlib.pyplot as plt
from skimage.exposure import match_histograms

dapi = ants.image_read("dapi.png")
ref = ants.image_read("ref.png")
dapi_c = ants.image_read("dapi_c.png")
ref_c = ants.image_read("ref_c.png")

# extra contour-driven metric for the deformable stage
reg_contour = ["MattesMutualInformation", ref_c, dapi_c, 500, 64, "Regular", 0.85]
metrics = [reg_contour]

## the actual registration

immt2 = ants.from_numpy(match_histograms(dapi.numpy(), ref.numpy()))  ## moving image after histogram matching
tx = ants.registration(fixed=ref, moving=immt2, type_of_transform="SyN",
                       metric="MattesMutualInformation",
                       reg_iterations=(500, 500, 500, 500, 500),
                       grad_step=0.25,
                       aff_iterations=(5000, 5000, 5000, 5000, 5000),
                       aff_shrink_factors=(4, 2, 2, 1, 1),
                       aff_smoothing_sigmas=(3, 2, 1, 0, 0),
                       multivariate_extras=metrics)

mywarpedimage = ants.apply_transforms(fixed=ref, moving=immt2,
                                      transformlist=tx['fwdtransforms'])

## plot the results: 0 is the fixed image, 1 is the moving image, 2 is the transformed image
f, a = plt.subplots(1, 3, figsize=(12, 12))
a[0].imshow(ref.numpy())
a[1].imshow(dapi.numpy())
a[2].imshow(mywarpedimage.numpy())
plt.show()

## overlay the transformed image on the reference image
ref.plot(overlay=mywarpedimage, overlay_alpha=0.3)
anorak94 commented 4 years ago

I have put up the code snippet and the images. My problem is that the registration quality is not good. The fixed image is the cerebellum cropped from a reference atlas at P7, and the moving image is a slice collected experimentally. dapi_c and ref_c are the contours from the cerebellum and the experimental image, which I extract using another algorithm. I am passing those as constraints to guide the registration, because I want the folds to line up and give a more geometrically correct result.

When I run the same thing multiple times, sometimes I get good results and sometimes not; the registration result is not reproducible across different runs with the same parameter settings. Is there any way to fix that?

anorak94 commented 4 years ago

https://github.com/ANTsX/ANTs/wiki/antsRegistration-reproducibility-issues

stnava commented 4 years ago

Yes, this is an improved example. I can look at this.

Regarding reproducibility: see help(ants.registration) and the information regarding the random seed.
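
For example, a minimal sketch of what that looks like, using ref and immt2 from the script above (assumption: this ANTsPy version forwards random_seed to antsRegistration; the ANTS_RANDOM_SEED and ITK_GLOBAL_DEFAULT_NUMBER_OF_THREADS environment variables are the knobs discussed in the reproducibility wiki linked above):

import os

# fix the seed and restrict threading for run-to-run reproducibility
os.environ["ANTS_RANDOM_SEED"] = "42"
os.environ["ITK_GLOBAL_DEFAULT_NUMBER_OF_THREADS"] = "1"

import ants

tx = ants.registration(fixed=ref, moving=immt2,
                       type_of_transform="SyN",
                       random_seed=42)  # see help(ants.registration)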

stnava commented 4 years ago

One issue with the above is that I don't have the png images.

anorak94 commented 4 years ago

https://we.tl/t-13eAFd802a

stnava commented 4 years ago

Can you try the following:

  1. convert all your data from png to nrrd or nifti (on disk):

     img = ants.image_read("filename.png")
     ants.image_write(img, "filename.nrrd")

  2. rerun your script with the nrrd data as input

and let me know if you see anything different.
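
For instance, a quick way to do that conversion for the four files from the snippet above (a sketch; the filenames are the ones used earlier in this thread):

import ants

# convert each png used above to nrrd on disk
for name in ("dapi", "ref", "dapi_c", "ref_c"):
    img = ants.image_read(name + ".png")
    ants.image_write(img, name + ".nrrd")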

anorak94 commented 4 years ago

The same problem persists.

[Screenshot: 2020-06-26 at 01 24 22]
dapi = ants.image_read("dapi.nrrd")
ref = ants.image_read("ref.nrrd")
dapi_c = ants.image_read("dapi_c.nrrd")
ref_c = ants.image_read("ref_c.nrrd")

reg_contour = ["MattesMutualInformation", ref_c, dapi_c, 500, 64, "Regular", 0.85]
metrics = [reg_contour]

## the actual registration

immt2 = ants.from_numpy(match_histograms(dapi.numpy(), ref.numpy()))  ## moving image after histogram matching
tx = ants.registration(fixed=ref, moving=immt2, type_of_transform="SyN",
                       metric="MattesMutualInformation",
                       reg_iterations=(500, 500, 500, 500, 500),
                       grad_step=0.25,
                       aff_iterations=(5000, 5000, 5000, 5000, 5000),
                       aff_shrink_factors=(4, 2, 2, 1, 1),
                       aff_smoothing_sigmas=(3, 2, 1, 0, 0),
                       multivariate_extras=metrics)

mywarpedimage = ants.apply_transforms(fixed=ref, moving=immt2,
                                      transformlist=tx['fwdtransforms'])

## plot the results: 0 is the fixed image, 1 is the moving image, 2 is the transformed image
f, a = plt.subplots(1, 3, figsize=(12, 12))
a[0].imshow(ref.numpy())
a[1].imshow(dapi.numpy())
a[2].imshow(mywarpedimage.numpy())
plt.show()

## overlay the transformed image on the reference image
ref.plot(overlay=mywarpedimage, overlay_alpha=0.3)
anorak94 commented 4 years ago
[Screenshot: 2020-06-26 at 01 25 36]

Second run.

anorak94 commented 4 years ago
[Screenshot: 2020-06-26 at 01 27 19]

The fifth run is better, but this kind of problem makes it harder to have ants as part of a high-throughput, fully automated pipeline, because then somebody needs to go and rerun the images that failed.

stnava commented 4 years ago

I get highly repeatable and reasonable results with this code. Let me know if you see the same.

import ants
from skimage.exposure import match_histograms
import matplotlib.pyplot as plt
import numpy as np
from skimage.color import rgb2gray
import pickle
from scipy.spatial import ConvexHull
from scipy import ndimage
from PIL import Image, ImageDraw
from scipy.ndimage import gaussian_filter
refB = ants.image_read('~/Downloads/ref.nrrd')
immt2B = ants.image_read('~/Downloads/immt2.nrrd')

# stage 1: coarse-to-fine affine pre-registration
tx = ants.registration(
    fixed=refB,
    moving=immt2B,
    type_of_transform="Affine",
    aff_iterations=(200, 200, 200, 200, 0, 0, 0),
    aff_smoothing_sigmas=(16, 8, 4, 2, 1, 1, 1),
    aff_shrink_factors=(16, 8, 4, 2, 1, 1, 1))

# stage 2: SyN refinement initialized with the affine result
txSyN = ants.registration(
    fixed=refB,
    moving=immt2B,
    initial_transform=tx['fwdtransforms'][0],
    total_sigma=3.0,
    flow_sigma=5.0,
    syn_sampling=2,
    syn_metric='CC',
    reg_iterations=(40, 20, 10),
    type_of_transform="SyN", verbose=True)

mywarpedimage = ants.apply_transforms(fixed=refB, moving=immt2B,
                                      transformlist=txSyN['fwdtransforms'])

## plot the results: 0 is the fixed image, 1 is the moving image, 2 is the transformed image
f, a = plt.subplots(1, 3, figsize=(12, 12))
a[0].imshow(refB.numpy())
a[1].imshow(immt2B.numpy())
a[2].imshow(mywarpedimage.numpy())
plt.show()
## overlay the transformed image on the reference image
refB.plot(overlay=mywarpedimage, overlay_alpha = 0.3)

[Attachment: temp.zip]

stnava commented 4 years ago

I agree that naive users have a hard time building reliable code. I doubt that has anything to do with ants, though.

anorak94 commented 4 years ago

Yeah, that's true, I wouldn't call myself an expert on building highly reliable and scalable code. ants is wonderful and I really enjoy using it; sorry if I was too careless with my words.

anorak94 commented 4 years ago

Could you tell me how you go about choosing parameters such as aff_iterations, aff_smoothing_sigmas, and aff_shrink_factors? I kind of know what they are, but choosing those parameter values seems kind of arbitrary to me, even in the SyN call. The only step I was missing was the initial transform. Also, how do flow_sigma and total_sigma influence the registration process? I see that they are the smoothing for the total field and the smoothing for the update field, but how does this manifest itself in the registration process?

stnava commented 4 years ago

There are two fields of computer vision that relate to what we do in ants (parameters etc.):

https://www.kth.se/profile/tony/page/scale-space-theory

and

https://en.wikipedia.org/wiki/Pattern_theory

in addition to basic ideas from optimization.

anorak94 commented 4 years ago

And also, how do flow_sigma and total_sigma influence the registration process? I see that they are the smoothing for the total field and the smoothing for the update field, but how does this manifest itself in the registration process?

stnava commented 4 years ago

You can run experiments to answer those questions in detail for your data. It's too complex to give a simple answer here, as it is a function of all possible input images / matching problems etc.
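
As a concrete starting point, here is a minimal sketch of such an experiment, using ref and immt2 from the script earlier in the thread and ants.image_mutual_information as a convenience score (the parameter values are placeholders, not recommendations):

import ants

# sweep the two smoothing parameters and score each warped result against the fixed image
for flow_sigma in (3.0, 5.0, 9.0):        # smoothing of the update (gradient) field
    for total_sigma in (0.0, 1.0, 3.0):   # smoothing of the total deformation field
        reg = ants.registration(fixed=ref, moving=immt2,
                                type_of_transform="SyN",
                                flow_sigma=flow_sigma,
                                total_sigma=total_sigma,
                                reg_iterations=(40, 20, 10))
        score = ants.image_mutual_information(ref, reg["warpedmovout"])
        print(flow_sigma, total_sigma, score)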