ANTsX / ANTsPy

A fast medical imaging analysis library in Python with algorithms for registration, segmentation, and more.
https://antspyx.readthedocs.io
Apache License 2.0

Unexpected values with label transformations #574

Closed FIrgolitsch closed 6 months ago

FIrgolitsch commented 6 months ago

Describe the bug
I'm trying to align the Allen Institute mouse brain annotation volume to our own volumes using the following sequence:

1. Register the Allen template to our own template: transform 1 (Similarity + SyN).
2. Register our own 3D brain acquisition to our own template: transform 2 (Similarity + SyN).

After this, I apply transform 1 and the inverse of transform 2 to the Allen annotation volume to bring the annotations into our own acquisition image space. For applying the transforms I use the genericLabel interpolator. This mostly works fine, except for a couple of regions, which end up with values that were not present in the original annotation volume. This is a problem for further analysis and image generation: the new values don't exist in the Allen Institute structure trees, so I can't map the right colour to them. I suspect label values are being summed somewhere, but I'm not sure of the cause.

I can share the volumes if required.

To reproduce

rigid_transform_template = ants.registration(fixed=lsfm_template, moving=template_allen, moving_mask=None,
                                             type_of_transform='Similarity', write_composite_transform=True)
syn_transform_template = ants.registration(fixed=lsfm_template, moving=template_allen, moving_mask=None,
                                           type_of_transform='SyN',
                                           initial_transform=rigid_transform_template['fwdtransforms'],
                                           write_composite_transform=True)

rigid_transform_image = ants.registration(fixed=lsfm_template, moving=image, moving_mask=temp_mask,
                                          type_of_transform='Similarity',
                                          write_composite_transform=True)
syn_transform_image = ants.registration(fixed=lsfm_template, moving=image, moving_mask=temp_mask,
                                        type_of_transform='SyN',
                                        initial_transform=rigid_transform_image['fwdtransforms'],
                                        write_composite_transform=True)

new_atlas = ants.apply_transforms(fixed=lsfm_template, moving=atlas_allen,
                                  transformlist=syn_transform_template['fwdtransforms'],
                                  interpolator="genericLabel")
new_atlas = ants.apply_transforms(fixed=image, moving=new_atlas,
                                  transformlist=syn_transform_image['invtransforms'],
                                  interpolator="genericLabel")

Expected behavior
There should be no integer values in the new image that were not present in the original volume; the interpolator should assign only existing label values.
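A quick way to verify that expectation (a hedged sketch; `new_label_ids` and the toy arrays below are illustrative, not part of the pipeline):

```python
import numpy as np

def new_label_ids(src: np.ndarray, warped: np.ndarray) -> set:
    """Label values present in the warped image but absent from the source."""
    return set(np.unique(warped).tolist()) - set(np.unique(src).tolist())

# Toy arrays standing in for atlas_allen.numpy() and new_atlas.numpy()
src = np.array([0, 312782550, 312782566], dtype=np.uint32)
warped = np.array([0, 312782560], dtype=np.uint32)
print(new_label_ids(src, warped))  # {312782560}
```

An empty set means the interpolator assigned only existing labels; any values printed are ids that were invented along the way.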

ntustison commented 6 months ago

Just curious---what specific integers get changed?

FIrgolitsch commented 6 months ago

Some regions of the atlas seem to get merged. Their ids end up at around 2x the original value, but not quite. It's some layers in the cortex.

ntustison commented 6 months ago

What are the actual values?

Also, note that the culpable code is an ITK function class so you might want to also ask over at the ITK discussion forum.

FIrgolitsch commented 6 months ago

I'm expecting 5 layers in the registered part: 312782550 312782554 312782558 312782562 312782566

I'm getting only one value: 312782560

gdevenyi commented 6 months ago

This may be an issue with the bit type of the saved file.

What's the header info for the input and output files?

FIrgolitsch commented 6 months ago

For the atlas:

ANTsImage (RAS)
     Pixel Type : unsigned int (uint32)
     Components : 1
     Dimensions : (456, 528, 320)
     Spacing    : (0.025, 0.025, 0.025)
     Origin     : (0.0, 0.0, 0.0)
     Direction  : [ 1.  0.  0.  0.  1.  0.  0.  0. -1.]

For the image:

ANTsImage (RSP)
     Pixel Type : unsigned int (uint32)
     Components : 1
     Dimensions : (512, 512, 515)
     Spacing    : (0.026, 0.026, 0.026)
     Origin     : (0.0, 0.0, 0.0)
     Direction  : [ 1.  0.  0.  0.  0. -1.  0. -1.  0.]

For our own template:

ANTsImage (RAS)
     Pixel Type : float (float32)
     Components : 1
     Dimensions : (456, 528, 320)
     Spacing    : (0.025, 0.025, 0.025)
     Origin     : (0.0, 0.0, 0.0)
     Direction  : [ 1.  0.  0.  0.  1.  0.  0.  0. -1.]

For the Allen template:

ANTsImage (RAS)
     Pixel Type : unsigned int (uint32)
     Components : 1
     Dimensions : (456, 528, 320)
     Spacing    : (0.025, 0.025, 0.025)
     Origin     : (0.0, 0.0, 0.0)
     Direction  : [ 1.  0.  0.  0.  1.  0.  0.  0. -1.]

For the registered atlas, the output file:

ANTsImage (RSP)
     Pixel Type : unsigned int (uint32)
     Components : 1
     Dimensions : (512, 512, 515)
     Spacing    : (0.026, 0.026, 0.026)
     Origin     : (0.0, 0.0, 0.0)
     Direction  : [ 1.  0.  0.  0.  0. -1.  0. -1.  0.]
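(Aside: the headers look consistent, but note the headroom mismatch between the pixel types involved. This is a general observation, not specific to these files: uint32 can hold ids up to about 4.29e9, while float32 represents every integer exactly only up to 2**24.)

```python
import numpy as np
# uint32 can store label ids up to ~4.29e9, but float32 (24-bit significand)
# represents every integer exactly only up to 2**24 = 16,777,216.
print(np.iinfo(np.uint32).max)  # 4294967295
print(2 ** 24)                  # 16777216
```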
gdevenyi commented 6 months ago

@ntustison am I correct that the data will make a round trip through float/double inside the resampling?

@FIrgolitsch what is the distribution of values in the input label set?

ntustison commented 6 months ago

I don't know. I've never dug into that code but we might have to or at least punt to the ITK discussion forum.

But here are the unique values of the Allen regional labels I have:

>>> np.array(df['Label'])
array([        1,         2,         6,         7,         9,        10,
              12,        15,        17,        19,        20,        23,
              26,        27,        28,        30,        33,        35,
              36,        38,        41,        42,        50,        52,
              54,        56,        58,        59,        62,        63,
              64,        66,        67,        68,        72,        74,
              75,        78,        81,        83,        84,        88,
              91,        93,        96,        97,        98,       100,
             101,       102,       105,       106,       108,       113,
             114,       115,       117,       118,       120,       121,
             122,       123,       125,       126,       128,       129,
             131,       132,       133,       136,       139,       140,
             143,       145,       146,       147,       148,       149,
             153,       155,       156,       158,       159,       162,
             163,       164,       169,       171,       173,       177,
             178,       180,       181,       186,       187,       188,
             189,       190,       194,       196,       197,       198,
             201,       202,       203,       204,       206,       207,
             209,       210,       211,       214,       215,       217,
             218,       222,       223,       225,       226,       229,
             230,       231,       233,       234,       237,       238,
             243,       246,       249,       250,       251,       252,
             255,       257,       258,       260,       262,       263,
             266,       268,       269,       271,       272,       274,
             279,       280,       281,       286,       287,       288,
             289,       292,       296,       298,       301,       303,
             304,       305,       307,       310,       311,       313,
             314,       318,       320,       321,       325,       326,
             327,       328,       330,       332,       333,       334,
             335,       336,       338,       342,       344,       347,
             349,       350,       351,       354,       355,       356,
             358,       362,       363,       364,       366,       368,
             372,       374,       377,       380,       381,       382,
             390,       393,       397,       401,       403,       412,
             413,       414,       421,       422,       423,       427,
             428,       429,       430,       433,       434,       436,
             437,       440,       441,       442,       443,       445,
             448,       449,       450,       451,       452,       456,
             460,       461,       463,       466,       469,       470,
             477,       478,       482,       483,       484,       488,
             501,       502,       506,       507,       510,       512,
             515,       520,       523,       525,       526,       527,
             530,       531,       534,       538,       540,       542,
             543,       544,       549,       551,       553,       556,
             558,       559,       564,       565,       566,       573,
             574,       575,       576,       577,       579,       580,
             581,       582,       583,       587,       588,       590,
             591,       593,       595,       596,       597,       598,
             599,       600,       601,       603,       604,       605,
             608,       609,       610,       611,       612,       613,
             614,       616,       620,       621,       622,       625,
             628,       629,       630,       632,       633,       634,
             638,       639,       642,       643,       648,       649,
             651,       653,       654,       655,       656,       657,
             658,       661,       662,       663,       664,       665,
             667,       670,       671,       672,       673,       675,
             678,       679,       680,       681,       685,       687,
             689,       690,       692,       693,       694,       696,
             697,       698,       699,       702,       703,       704,
             706,       707,       711,       718,       721,       725,
             727,       728,       729,       732,       733,       735,
             741,       743,       744,       749,       750,       753,
             754,       755,       757,       759,       763,       765,
             767,       771,       772,       773,       774,       778,
             780,       781,       783,       784,       786,       788,
             791,       794,       795,       797,       798,       800,
             802,       803,       804,       805,       806,       810,
             811,       812,       814,       816,       819,       820,
             821,       827,       828,       830,       831,       832,
             834,       836,       838,       839,       841,       842,
             843,       844,       846,       847,       848,       849,
             850,       851,       852,       854,       857,       859,
             862,       863,       866,       867,       869,       872,
             873,       874,       878,       880,       882,       884,
             888,       889,       893,       897,       898,       900,
             902,       903,       905,       906,       907,       908,
             910,       911,       912,       914,       916,       919,
             924,       927,       929,       930,       931,       935,
             936,       939,       940,       943,       944,       945,
             946,       949,       950,       951,       952,       954,
             955,       956,       957,       959,       961,       962,
             963,       964,       965,       966,       968,       969,
             970,       971,       973,       974,       975,       976,
             977,       978,       980,       981,       982,       984,
             986,       988,       989,       990,       996,       997,
             998,      1004,      1005,      1006,      1007,      1009,
            1010,      1015,      1016,      1020,      1021,      1022,
            1023,      1025,      1026,      1029,      1030,      1031,
            1033,      1035,      1037,      1038,      1039,      1041,
            1043,      1044,      1045,      1046,      1047,      1048,
            1049,      1051,      1052,      1054,      1056,      1058,
            1060,      1061,      1062,      1064,      1066,      1069,
            1070,      1072,      1074,      1077,      1079,      1081,
            1084,      1085,      1086,      1088,      1089,      1090,
            1091,      1092,      1093,      1094,      1096,      1097,
            1098,      1101,      1102,      1104,      1105,      1106,
            1107,      1108,      1109,      1111,      1113,      1114,
            1116,      1120,      1121,      1123,      1125,      1126,
            1127,      1128,      1139,     10671,     10703,     10704,
       182305696, 182305712, 312782560, 312782592, 312782624, 312782656,
       484682464, 484682496, 484682528, 496345664, 526157184, 526322272,
       527696992, 549009216, 560581568, 563807424, 576073728, 589508416,
       589508480, 599626944, 606826624, 606826688, 607344832, 614454272])
cookpa commented 6 months ago

The moving image is cloned as float32 inside apply_transforms.py, so it will already be float before it gets to the ITK filter.

cookpa commented 6 months ago

Does it work correctly with NearestNeighbor interpolation?

FIrgolitsch commented 6 months ago

Nearest neighbour has the same problem. I actually tried that first before switching to the genericLabel one.

cookpa commented 6 months ago

Thanks, that suggests the label information is being lost before the resampling stage.

Do you have a link or can you share the label image? It seems you have different values to the ones @ntustison posted.

ntustison commented 6 months ago

I was thinking the same thing. Those regions are tiny and I suspect could easily get lost in the deformable transformation and reference image space.

cookpa commented 6 months ago

Absolutely, I can see labels getting lost, but I don't think new regions should be introduced. That suggests a loss of precision somewhere up the chain.

It might have to do with numpy: it has its own primitive types, which can behave differently from the built-in int / float types.
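A minimal illustration of that difference, using one of the large Allen ids from this thread:

```python
import numpy as np

label = 614454277                        # one of the large Allen label ids
assert float(label) == label             # builtin float is 64-bit: exact here
assert int(np.float32(label)) != label   # float32 rounds it (to 614454272)
```

Python's built-in `float` is a 64-bit double, so it holds any integer below 2**53 exactly; numpy's fixed-width `float32` does not.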

FIrgolitsch commented 6 months ago

Here is a link to the output volume: Link

cookpa commented 6 months ago

Thanks, may I also have the input labels? I will try to track down where it changes

FIrgolitsch commented 6 months ago

I have a OneDrive link here: Link

Or you can download them directly from Allen here:

from allensdk.core.reference_space_cache import ReferenceSpaceCache

temp_dir = ""
rsc = ReferenceSpaceCache(25, reference_space_key="annotation/ccf_2017")
annotation_volume, _ = rsc.get_annotation_volume(f"{temp_dir}/annotation_25.nrrd")
FIrgolitsch commented 6 months ago

@ntustison What resolution and version of the CCF did you use for those values? They don't quite match mine. These are the ones I have in my annotation volume. It's the 25-micron CCF2017 version.

[        0         1         2         6         7         9        10
        12        15        17        19        20        23        26
        27        28        30        33        35        36        38
        41        42        50        52        54        56        58
        59        62        63        64        66        67        68
        72        74        75        78        81        83        84
        88        91        93        96        97        98       100
       101       102       105       106       108       113       114
       115       117       118       120       121       122       123
       125       126       128       129       131       132       133
       136       139       140       143       145       146       147
       148       149       153       155       156       158       159
       162       163       164       169       171       173       177
       178       180       181       186       187       188       189
       190       194       196       197       198       201       202
       203       204       206       207       209       210       211
       214       215       217       218       222       223       225
       226       229       230       231       233       234       237
       238       243       246       249       250       251       252
       255       257       258       260       262       263       266
       268       269       271       272       274       279       280
       281       286       287       288       289       292       296
       298       301       303       304       305       307       310
       311       313       314       318       320       321       325
       326       327       328       330       332       333       334
       335       336       338       342       344       347       349
       350       351       354       355       356       358       362
       363       364       366       368       372       374       377
       380       381       382       390       393       397       401
       403       412       413       414       421       422       423
       427       428       429       430       433       434       436
       437       440       441       442       443       445       448
       449       450       451       452       456       460       461
       463       466       469       470       477       478       482
       483       484       488       501       502       506       507
       510       512       515       520       523       525       526
       527       530       531       534       538       540       542
       543       544       549       551       553       556       558
       559       564       565       566       573       574       575
       576       577       579       580       581       582       583
       587       588       590       591       593       595       596
       597       598       599       600       601       603       604
       605       608       609       610       611       612       613
       614       616       620       621       622       625       628
       629       630       632       633       634       638       639
       642       643       648       649       651       653       654
       655       656       657       658       661       662       663
       664       665       667       670       671       672       673
       675       678       679       680       681       685       687
       689       690       692       693       694       696       697
       698       699       702       703       704       706       707
       711       718       721       725       727       728       729
       732       733       735       741       743       744       749
       750       753       754       755       757       759       763
       765       767       771       772       773       774       778
       780       781       783       784       786       788       791
       794       795       797       798       800       802       803
       804       805       806       810       811       812       814
       816       819       820       821       827       828       830
       831       832       834       836       838       839       841
       842       843       844       846       847       848       849
       850       851       852       854       857       859       862
       863       866       867       869       872       873       874
       878       880       882       884       888       889       893
       897       898       900       902       903       905       906
       907       908       910       911       912       914       916
       919       924       927       929       930       931       935
       936       939       940       943       944       945       946
       949       950       951       952       954       955       956
       957       959       961       962       963       964       965
       966       968       969       970       971       973       974
       975       976       977       978       980       981       982
       984       986       988       989       990       996       997
       998      1004      1005      1006      1007      1009      1010
      1015      1016      1020      1021      1022      1023      1025
      1026      1029      1030      1031      1033      1035      1037
      1038      1039      1041      1043      1044      1045      1046
      1047      1048      1049      1051      1052      1054      1056
      1058      1060      1061      1062      1064      1066      1069
      1070      1072      1074      1077      1079      1081      1084
      1085      1086      1088      1089      1090      1091      1092
      1093      1094      1096      1097      1098      1101      1102
      1104      1105      1106      1107      1108      1109      1111
      1113      1114      1116      1120      1121      1123      1125
      1126      1127      1128      1139     10671     10703     10704
 182305693 182305697 182305701 182305705 182305709 182305713 312782550
 312782554 312782558 312782562 312782566 312782570 312782578 312782582
 312782586 312782590 312782594 312782598 312782604 312782608 312782612
 312782616 312782620 312782624 312782632 312782636 312782640 312782644
 312782648 312782652 484682470 484682508 484682512 484682516 484682520
 484682524 484682528 496345664 496345668 496345672 526157192 526157196
 526322264 527696977 549009203 549009211 549009215 549009219 549009223
 549009227 560581551 560581559 560581563 563807435 563807439 576073699
 576073704 589508447 589508451 589508455 599626923 599626927 606826647
 606826651 606826655 606826659 606826663 607344830 607344834 607344838
 607344842 607344846 607344850 607344854 607344858 607344862 614454277]
ntustison commented 6 months ago

I use the Allen SDK utilities as well, but there's also an AFNI version floating around in what I have, so I could've gotten them mixed up. Regardless, we should ignore what I wrote. One issue with your labels: I don't see label 312782560, which is what you said everything merges to.

ntustison commented 6 months ago

In other words, perhaps that goes to @cookpa 's idea about what is happening.

cookpa commented 6 months ago

I think this might be it.

https://github.com/ANTsX/ANTsPy/blob/ff7413a5e7685b94ad7fc9a9b2553656f3bd005e/ants/registration/apply_transforms.py#L111-L114

The moving image is cloned to float, but the output image is a clone of the original without type specified. So antsApplyTransforms is trying to write its float output into a uint32 image array.
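A numpy sketch of the suspected mechanism (not the actual ITK code path): a value that has passed through float32 keeps its rounding error when written back into an integer buffer.

```python
import numpy as np

out = np.zeros(1, dtype=np.uint32)   # stands in for the uint32 output image
out[0] = np.float32(312782550)       # resampled value arrives as float32
print(int(out[0]))                   # 312782560, not 312782550
```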

cookpa commented 6 months ago

To quickly test this, try cloning the moving image as float before the call to apply_transforms, then do new_atlas_int = new_atlas.clone('unsigned int')

FIrgolitsch commented 6 months ago

I tried casting the atlas and the main image to float before the transformation, but the problem still occurs. lsfm_template was already float.

atlas_moving = ants.image_clone(atlas_allen, pixeltype="float")
image_fixed = ants.image_clone(image, pixeltype="float")
new_atlas = ants.apply_transforms(fixed=lsfm_template, moving=atlas_moving,
                                  transformlist=syn_transform_template['fwdtransforms'],
                                  interpolator="genericLabel")
new_atlas = ants.apply_transforms(fixed=image_fixed, moving=new_atlas,
                                  transformlist=syn_transform_image['invtransforms'],
                                  interpolator="genericLabel")

new_atlas_int = ants.image_clone(new_atlas, pixeltype="unsigned int")
new_atlas_int.to_file("test/S18_atlas_allen_int.nii")
ntustison commented 6 months ago

Are you checking your atlas labels before or after the line new_atlas_int = ants.image_clone(new_atlas, pixeltype="unsigned int")?

cookpa commented 6 months ago

Also, as a general point, I'd avoid the double interpolation by concatenating the warps into a single apply_transforms call. I have to work on other stuff now, but can look more at this tonight or tomorrow.

FIrgolitsch commented 6 months ago

@ntustison I checked both, and both have the problem. new_atlas is in float before saving.

@cookpa I tried the single apply_transforms call, but it never seems to transform well: I always get a different result than when applying the transforms separately. In this case, the final atlas is shrunk compared to applying the transforms separately.

new_atlas = ants.apply_transforms(fixed=image_fixed, moving=atlas_moving,
                                  transformlist=[syn_transform_template['fwdtransforms'],
                                                 syn_transform_image['invtransforms']],
                                  interpolator="genericLabel")

This is the way to do it right?

ntustison commented 6 months ago

Compare the length of unique labels (and the set difference) between the SITK and ANTs read image functions where the latter appears to be losing labels:

ants:  619
sitk:  672
Diff:  [182305696 182305712 312782560 312782592 312782656 484682464 484682496
 526157184 526322272 527696992 549009216 560581568 563807424 576073728
 589508416 589508480 599626944 606826624 606826688 607344832 614454272]
from pathlib import Path
import os
import numpy as np
import ants
import SimpleITK as sitk
from allensdk.core.reference_space_cache import ReferenceSpaceCache

output_dir = '.'
reference_space_key = os.path.join('annotation', 'ccf_2017')
resolution = 25

rspc = ReferenceSpaceCache(resolution, reference_space_key, manifest=Path(output_dir)/'manifest.json')
rsp = rspc.get_reference_space("reference.nrrd", "annotation.nrrd")

ants_image = ants.image_read("annotation.nrrd")
ants_array = ants_image.numpy()
ants_unique_labels = np.unique(ants_array)
print("ants: ", len(ants_unique_labels))

sitk_image = sitk.ReadImage("annotation.nrrd")
sitk_array = sitk.GetArrayFromImage(sitk_image)
sitk_unique_labels = np.unique(sitk_array)
print("sitk: ", len(sitk_unique_labels))

print("Diff: ", np.setdiff1d(ants_unique_labels, sitk_unique_labels).astype('int'))
cookpa commented 6 months ago

Thanks @ntustison - it looks like it's definitely to do with the float data type. When read explicitly as uint, it works:

>>> allen = ants.image_read('allen_atlas_25.nrrd', pixeltype='unsigned int')
>>> ants_array = allen.numpy()
>>> ants_unique_labels = np.unique(ants_array)
>>> len(ants_unique_labels)
672

The SimpleITK read method preserves the data type of the input, so sitk_array has dtype=uint32 by default.

cookpa commented 6 months ago

But then:

>>> ants_array_float = ants_array.astype('float')
>>> ants_unique_labels_float = np.unique(ants_array_float)
>>> len(ants_unique_labels_float)
672

EDIT: This works because the default 'float' in numpy is float64

ntustison commented 6 months ago

I guess the issue isn't restricted to ants:

>>> sitk_image = sitk.ReadImage("annotation.nrrd", sitk.sitkFloat32)
>>> sitk_array = sitk.GetArrayFromImage(sitk_image)
>>> sitk_unique_labels = np.unique(sitk_array)
>>> len(sitk_unique_labels)
619

with the only commonality being the float representation. So I confess ignorance---is there even a fix for this?

gdevenyi commented 6 months ago

I think this is a combination of the (insane) label value choices from the Allen, and inappropriate pixel type conversions.

Personally, when I worked with this labelset, I remapped the values into a much smaller and more sensible contiguous range and then worked with them.
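One way to do that remapping (a hedged sketch; `compress_labels` is a made-up helper, not an ANTsPy or Allen SDK function):

```python
import numpy as np

def compress_labels(arr: np.ndarray):
    """Map sparse label ids onto 0..N-1; return the compact image and the id table."""
    ids = np.unique(arr)                             # sorted unique ids, incl. background
    compact = np.searchsorted(ids, arr).astype(np.uint16)
    return compact, ids                              # ids[compact] restores the originals

labels = np.array([0, 312782550, 614454277, 0], dtype=np.uint32)
compact, ids = compress_labels(labels)
print(compact)                                       # [0 1 2 0]
assert (ids[compact] == labels).all()                # round trip is lossless
```

The compact ids fit comfortably in float32 (the Allen atlas has well under 2**16 regions), so they survive resampling; apply `ids[...]` to the warped result to restore the original values.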

cookpa commented 6 months ago

I was confused before: I used numpy's 'float' type, which defaults to float64. With 32-bit floats, we see the lossy behavior:

>>> a = np.asarray(614454277, dtype='uint32')
>>> af = a.astype('float32')
>>> af
array(6.144543e+08, dtype=float32)
>>> np.unique(af)
array([6.144543e+08], dtype=float32)
>>> af.astype('uint32')
array(614454272, dtype=uint32)

float32 can't represent such a large integer exactly.

I think the only ways to work around this are either to remap the labels, or to use antsApplyTransforms on the command line with double precision (don't add --float).
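For what it's worth, this rounding appears to reproduce the exact merge reported above: near 3.1e8 the float32 spacing between representable values is 32, and all five expected cortical-layer ids round to the same one.

```python
import numpy as np

# The five expected layer labels from the report
expected = [312782550, 312782554, 312782558, 312782562, 312782566]
# Each rounds to the nearest float32-representable integer (spacing 32 here)
print({int(np.float32(v)) for v in expected})   # {312782560}
```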

cookpa commented 6 months ago

Regarding the transform list, I think the correct call would be

new_atlas = ants.apply_transforms(fixed=image, moving=atlas_allen,
                                  transformlist=[syn_transform_image['invtransforms'],
                                                 syn_transform_template['fwdtransforms']],
                                  interpolator="genericLabel")

The ordering is counter-intuitive when one thinks about the transforms moving images from one space to another. Internally, the program is trying to construct a new image in the fixed space, and needs to warp sample points from the fixed to the moving space.

FIrgolitsch commented 6 months ago

I think the only ways to work around this are either to remap the labels, or to use antsApplyTransforms on the command line with double precision (don't add --float).

Thanks for the suggestion! Is there a reference for the ANTs command-line tools? I've only ever used ANTsPy.

cookpa commented 6 months ago

Some examples here:

https://github.com/ANTsX/ANTs/wiki/Forward-and-inverse-warps-for-warping-images,-pointsets-and-Jacobians

You can build the code from source or get binaries here https://github.com/ANTsX/ANTs/releases/tag/v2.5.1

FIrgolitsch commented 6 months ago

Thank you very much! I tested some more with ANTsPy and got it to work by editing the apply_transforms function directly in the package. I changed the type casts to double:

fixed = fixed.clone('double')
moving = moving.clone('double')
warpedmovout = moving.clone('double')

and dropped the --float from the arguments as suggested:

processed_args = myargs + ['-z', str(1), '-v', str(myverb), str(1), '-e', str(imagetype), '-f', str(defaultvalue)]

Would it be possible to add an option to the function in the future? I'm using AntsPy as part of a processing pipeline and I would like to stick to Python without any manual command line calls.

cookpa commented 6 months ago

@FIrgolitsch this looks like a good idea.

@stnava @ntustison @ncullen93, one option would be to just use double in apply_transforms. Alternatively, we could try to use the most memory-efficient representation that avoids this problem:

Input / output:

- unsigned char / float
- float / float
- unsigned int / double
- double / double

ncullen93 commented 6 months ago

Using double should be OK in my opinion.

ntustison commented 6 months ago

Sounds okay to me, too. But I defer to y'all.

cookpa commented 6 months ago

Thanks for the suggestion @FIrgolitsch - double is now an option for apply_transforms.

I have noticed, though, that other utils always use / return float, so there's potential for loss of precision elsewhere (e.g., I was just working on pad_image).

The Allen Institute has an official function to map labels into the uint16 range so they can be represented in ITK-SNAP. That might be a good convention to adopt if you need to do anything beyond apply_transforms.

https://allensdk.readthedocs.io/en/latest/_modules/allensdk/core/reference_space.html#ReferenceSpace.write_itksnap_labels