Closed: kate-sann5100 closed this issue 2 years ago.
Thanks, I've added this to the image registration milestone.
After searching, I have found the following mutual information implementations in established repositories:

- skimage.metrics.normalized_mutual_information
- ants.image_mutual_information (source code)
- itkJointHistogramMutualInformationImageToImageMetric
- itkNormalizedMutualInformationHistogramImageToImageMetric
- itkMattesMutualInformationImageToImageMetric

The most similar implementation to ours is itkMattesMutualInformationImageToImageMetric, while ants.image_mutual_information is its Python wrapper.
The main difference is that while Mattes mutual information adopts a B-spline kernel to calculate the weight each voxel adds to its respective bins, our implementation (similar to DeepReg and VoxelMorph) adopts a Gaussian kernel.
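As an illustration of the Gaussian-window idea, here is a minimal NumPy sketch (not MONAI's actual code; the name gaussian_parzen_mi and the num_bins / sigma_ratio parameters are chosen here for exposition, and inputs are assumed to lie in [0, 1]):

```python
import numpy as np

def gaussian_parzen_mi(pred, target, num_bins=32, sigma_ratio=0.5):
    """Mutual information with Gaussian Parzen windowing: every voxel
    contributes a soft, Gaussian-weighted vote to all bins (hypothetical
    sketch; inputs assumed to lie in [0, 1])."""
    pred = np.asarray(pred, dtype=np.float64).ravel()
    target = np.asarray(target, dtype=np.float64).ravel()
    centers = np.linspace(0.0, 1.0, num_bins)
    sigma = sigma_ratio * (centers[1] - centers[0])
    preterm = 1.0 / (2.0 * sigma ** 2)

    def soft_hist(x):
        # (n_voxels, num_bins) weights, each row normalised to sum to 1
        w = np.exp(-preterm * (x[:, None] - centers[None, :]) ** 2)
        return w / w.sum(axis=1, keepdims=True)

    wa, wb = soft_hist(pred), soft_hist(target)
    pab = wa.T @ wb / pred.size            # joint probability, sums to 1
    pa, pb = pab.sum(axis=1), pab.sum(axis=0)
    eps = np.finfo(np.float64).eps
    return float(np.sum(pab * np.log((pab + eps) / (np.outer(pa, pb) + eps))))

rng = np.random.default_rng(0)
x, y = rng.random(5000), rng.random(5000)
mi_same = gaussian_parzen_mi(x, x)   # large: identical inputs
mi_indep = gaussian_parzen_mi(x, y)  # near zero: unrelated inputs
```

Because the soft assignment is a smooth function of the intensities, this formulation is differentiable everywhere, which is why it suits a loss function.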
The following steps are planned:

1. Verify our implementation against ants.image_mutual_information. This could be achieved either by obtaining the same result for the same input, or by showing changes in the same direction as the input changes.
2. Investigate the sigma parameter.

Current and future progress will be uploaded to this pull request: https://github.com/Project-MONAI/MONAI/pull/2847
@wyli @YipengHu
I have plotted the MI of image pairs transformed by monai.transforms.Affined(translate_params=(i, i, i)) at different values of the translate param i, computed by both our and ANTsPy's implementations, showing how MI changes as i increases.
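To sanity-check the "same direction" behaviour without any imaging dependencies, a plain hard-binned MI can be evaluated on shifted copies of a smooth 1-D signal (hist_mi is an illustrative helper, not either library's code, and np.roll stands in for the Affined translation):

```python
import numpy as np

def hist_mi(a, b, bins=32):
    # plain joint-histogram MI (hard binning), enough for trend checks
    pab, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins,
                               range=[[0, 1], [0, 1]])
    pab /= pab.sum()
    pa, pb = pab.sum(axis=1), pab.sum(axis=0)
    nz = pab > 0
    return float(np.sum(pab[nz] * np.log(pab[nz] / np.outer(pa, pb)[nz])))

rng = np.random.default_rng(0)
# smooth 1-D "image": cumulative noise rescaled into [0, 1]
sig = np.cumsum(rng.normal(size=4096))
sig = (sig - sig.min()) / (sig.max() - sig.min())
# MI between the signal and shifted copies of itself:
# larger shifts (worse "alignment") should give lower MI
mis = [hist_mi(sig, np.roll(sig, s)) for s in (0, 8, 64)]
```

The decreasing trend in `mis` is the qualitative shape the translation benchmark above is checking for.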
looks cool, after the verifications, if we can reproduce the blue line in the unit test, it'll be a nice milestone, what do you think?
Agreed.
I have also visualised the trend of MI against rotation.
Moreover, here are the results under the two scenarios suggested by @YipengHu
thanks, is this based on the latest implementation? I've updated the rounding option https://github.com/Project-MONAI/MONAI/pull/2847/commits/e5c6db061ab4b8c26f4209621fb0f99e1129df05 to make the b-spline option differentiable (see the test https://github.com/kate-sann5100/MONAI/blob/e5c6db061ab4b8c26f4209621fb0f99e1129df05/tests/test_reg_loss_integration.py#L39).
I have put a better-written version of the results in my forked MONAI (to avoid adding new dependencies to this main repository).
The results not mentioned there are produced by the current code in pull request 3196, where a bug fix has been made.
I will add the benchmarking results to the unit tests in the same pull request within this week, if you are happy with that.
Advice is welcome on how to improve the structure and writing of the benchmarking report.
@wyli @YipengHu
Thanks, it looks great, I put the testing data in the project share drive here: https://drive.google.com/uc?id=17tsDLvG_GZm7a4fCVMCv-KyDx0hqq1ji
, you can follow this approach to set up the tests with a data downloading step: https://github.com/Project-MONAI/MONAI/blob/7dc364cdf5524fd6a2d24d2abc63894651ce4c61/tests/test_patch_wsi_dataset.py#L109-L110
The markdown documentation could go to the tutorial repo, perhaps the modules
folder https://github.com/Project-MONAI/tutorials/tree/master/modules.
@kate-sann5100 @YipengHu I think we can merge this part and have a chat next week?
Fine with me, although the equations do not show up well on my dark theme.
Is your feature request related to a problem? Please describe.
The current implementation of GlobalMutualInformationLoss can only apply Parzen windowing with bins distributed evenly between 0 and 1, instead of customising bins according to the input distribution. This leads to two problems:
1) The inputs (target and pred) must range between 0 and 1.
2) It is hard to benchmark this loss against implementations from other packages (e.g. antspyx).

Describe the solution you'd like
Reimplement GlobalMutualInformationLoss.
Describe alternatives you've considered
N/A

Additional context
N/A
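A minimal sketch of the requested behaviour (NumPy only; adaptive_bin_centers and rescale01 are hypothetical helper names, not part of MONAI): bin centers could be placed over the observed intensity range, and until then min-max scaling is a workaround for the [0, 1] restriction:

```python
import numpy as np

def adaptive_bin_centers(x, num_bins=32):
    """Hypothetical helper: spread Parzen bin centers over the observed
    intensity range instead of assuming inputs lie in [0, 1]."""
    lo, hi = float(np.min(x)), float(np.max(x))
    return np.linspace(lo, hi, num_bins)

def rescale01(x):
    """Workaround for the current restriction: min-max scale intensities
    into [0, 1] before passing them to the loss."""
    x = np.asarray(x, dtype=np.float64)
    lo, hi = x.min(), x.max()
    return (x - lo) / max(hi - lo, np.finfo(np.float64).eps)

# CT-like intensities far outside [0, 1]
img = np.array([-1000.0, -200.0, 40.0, 400.0, 3000.0])
centers = adaptive_bin_centers(img)  # spans [-1000, 3000]
scaled = rescale01(img)              # spans [0, 1]
```

Note that min-max scaling couples the binning to each image's outliers, which is one reason benchmarking against other packages is awkward and adaptive bins were requested instead.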