rordenlab / dcm2niix

dcm2nii DICOM to NIfTI converter: compiled versions available from NITRC
https://www.nitrc.org/plugins/mwiki/index.php/dcm2nii:MainPage

Converting DICOM to nii leads to a different origin #718

Closed felenitaribeiro closed 1 year ago

felenitaribeiro commented 1 year ago

Hi, I have run into a bug when converting DICOM data to nifti. Specifically, the nifti file appears flipped when I load the segmentation file on top of it. We noticed that when we load the DICOM and nifti files in ITK-SNAP, the origins of the images are different, as shown in the "image information" menu: the y coordinate has the opposite sign and a slightly different magnitude.

I got this behavior when converting NCI prostate data (https://wiki.cancerimagingarchive.net/pages/viewpage.action?pageId=21267207) to nii:

  1. To acquire the data, you first need to install their data retriever (https://wiki.cancerimagingarchive.net/display/NBIA/Downloading+TCIA+Images);
  2. Download the manifest file (.tcia) for "Image Data - Training (DICOM, 60 subjects)" from https://wiki.cancerimagingarchive.net/pages/viewpage.action?pageId=21267207; I only used the 3T data.
  3. Open the manifest file with the data retriever and download the data.
  4. You can then run dcm2niix on this data to reproduce the behavior. Load the underlying image (nifti or DICOM) together with its segmentation ("NCI Challenge Data - Training (NRRD, 60 subjects)"). Image and segmentation are well aligned for DICOM, but not for nifti (the labels appear flipped). I've also tried the dicom2nifti Python package, which behaves the same way. Strangely, if I convert the DICOM files to minc and then to nifti, everything is correct.

We tested both the dcm2niix provided by bidstools on Neurodesk and the latest dcm2niix.

neurolabusc commented 1 year ago

I am unable to replicate. Here is the DICOM image ProstateDx-03-0001 shown as a DICOM with AlizaMS, and as a NIfTI with the NRRD ProstateDx-03-0001_truth.nrrd shown as an overlay using both ITK-SNAP 4.0 and MRIcroGL.

I wonder if your confusion stems from the coordinate systems used by NIfTI versus DICOM. For NIfTI the X coordinate increases as you move right, the Y increases anteriorly and the Z increases superiorly (RAS). With DICOM, the X increases as you go left, the Y increases as you move posteriorly and Z increases superiorly (LPS).
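
As a quick illustration of that difference (a minimal sketch using a made-up point, not values from this dataset), the same physical location expressed in the two conventions differs only in the sign of its first two coordinates:

import numpy as np

def lps_to_ras(point_lps):
    # DICOM/ITK report coordinates in LPS; NIfTI tools conventionally report RAS.
    # Left->Right and Posterior->Anterior are sign flips; Superior is shared.
    x, y, z = point_lps
    return np.array([-x, -y, z])

origin_lps = np.array([96.5, 121.3, -45.0])  # hypothetical origin in LPS
print(lps_to_ras(origin_lps))                # -> [-96.5 -121.3 -45.] in RAS

This is why the origin reported for a DICOM series and for the corresponding nifti can legitimately differ in the sign of the x and y coordinates.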


felenitaribeiro commented 1 year ago

Hi, thank you for taking the time to look at it.

Just to make sure we are looking at the same data: I used Prostate3T-01-0001 (/manifest-.../Prostate-3T/Prostate3T-01-0027/01-30-2003-NA-MR...), for example. On Neurodesk, I loaded the bidstools module and ran dcm2niix on the folder with the DICOM files. Here is what I get when I load the generated nifti file with Prostate3T-01-0027.nrrd (screenshot), and the image origin (screenshot).

When I load the DICOM files with Prostate3T-01-0027.nrrd, the alignment is perfect (screenshot), and here is the origin (screenshot).

neurolabusc commented 1 year ago

@felenitaribeiro this is an issue with the ITK-SNAP Segmentation/OpenSegmentation menu item, which does not re-orient the segmentation layer (e.g. Prostate3T-01-0027.nrrd) to match the space of the background image. Notice that if you instead use the File/AddAnotherImage menu item, select the file Prostate3T-01-0027.nrrd, and choose to display it as a transparent overlay, the nrrd file is correctly oriented.

(screenshots showing the correctly oriented overlay)

felenitaribeiro commented 1 year ago

I also encountered the same problem when using matplotlib in Python:

import matplotlib.pyplot as plt
import nibabel as nib
import numpy as np

# NCI Prostate dataset
img = nib.load('path_to_data/4.000000-t2tsetra320p2-89418_t2_tse_tra_320_p2_20030130130650_4.nii').get_fdata()
label = nib.load('path_to_data/labels/Prostate3T-01-0027.nii.gz').get_fdata()

# Plot one axial slice of the image with the labels overlaid
plt.imshow(np.rot90(img[:, :, 10]), cmap='gray')
plt.imshow(np.rot90(label[:, :, 10]), cmap='magma', alpha=.2, vmax=3)
plt.show()

(screenshot showing the mismatched overlay)

But if I convert the DICOM files to minc (using dcm2mnc after loading the minc module on Neurodesk) and then to nifti (using mnc2nii), and visualize the resulting image (.nii) with Prostate3T-01-0027.nrrd as the segmentation, I don't see the mismatch in either ITK-SNAP or matplotlib.

Thank you!

neurolabusc commented 1 year ago

@felenitaribeiro these are issues with ITK-SNAP and matplotlib. The core issue is that the NIfTI specification uses the reverse row order compared to DICOM and NRRD. For DICOM, MINC, and NRRD the first row is at the top of the page with subsequent lines below it, the way we write English. For NIfTI, the first line is at the bottom of the page, with higher row numbers further up the page, the way we plot the Y axis of a Cartesian graph. Tools that ignore the spatial transform, and therefore this difference, will show the errors you describe. It appears that matplotlib and ITK-SNAP are simply displaying the values as stored on disk.
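
A quick way to check what each file actually stores is nibabel's aff2axcodes (a minimal sketch with placeholder paths): it reports the world direction each voxel axis grows toward according to the affine, so a file kept in DICOM/NRRD row order and a dcm2niix-converted file will typically differ in the row-axis code.

import nibabel as nib

# Placeholder paths: your dcm2niix-converted image and the NRRD-derived label
img = nib.load('path_to_data/image.nii')
label = nib.load('path_to_data/labels/Prostate3T-01-0027.nii.gz')

# World direction each stored voxel axis grows toward, per the affine
print(nib.aff2axcodes(img.affine))
print(nib.aff2axcodes(label.affine))  # if a letter differs (e.g. A vs P), the stored row order differs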

You can see in your Python code that you are only plotting the raw voxel data, so it can only display the rows and columns in the order they were loaded, without any consideration of the spatial transforms. If all your images are in RAS NIfTI and all your labels are in LPS NRRD, you could flip your image data (or, more robustly, reorient both volumes using their affines, as sketched below). However, note that dcm2niix will preserve slice order for EPI series (to retain slice timing), so a blind flip would only work for axial acquisitions.
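
One affine-aware way to do this from Python is nibabel's as_closest_canonical, which permutes and flips the voxel axes of each volume so both are stored in (approximately) RAS order before you index into them. A minimal sketch with placeholder paths, assuming the image and label already share the same voxel grid (as they do here):

import matplotlib.pyplot as plt
import nibabel as nib
import numpy as np

# Placeholder paths; substitute your own files
img = nib.as_closest_canonical(nib.load('path_to_data/image.nii'))
label = nib.as_closest_canonical(nib.load('path_to_data/labels/Prostate3T-01-0027.nii.gz'))

img_data = img.get_fdata()
label_data = label.get_fdata()

# Both arrays are now in the same (RAS) voxel order, so matching slice
# indices line up regardless of how each file was stored on disk
plt.imshow(np.rot90(img_data[:, :, 10]), cmap='gray')
plt.imshow(np.rot90(label_data[:, :, 10]), cmap='magma', alpha=.2, vmax=3)
plt.show()

Note that this only reorders voxel axes; it does not resample, so it relies on both files describing the same grid with correct affines.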

In theory, you can ask dcm2niix to preserve the DICOM's row order by disabling the row flip (`-y n`). The sform/qform spatial transforms will encode this, so it should be lossless for tools that correctly implement the NIfTI format, but the row order will appear reversed for naive tools that merely display the data as stored on disk. Be aware that this feature of dcm2niix is not well tested (I am not 100% sure it reports diffusion bvecs and phase-encoding details correctly), so use it with caution:

dcm2niix -y n /path/to/DICOMs

In brief, you are experiencing issues with the tools you are using, not a limitation of dcm2niix.


felenitaribeiro commented 1 year ago

@neurolabusc Thanks for the clarification!