mne-tools / mne-bids

MNE-BIDS is a Python package that allows you to read and write BIDS-compatible datasets with the help of MNE-Python.
https://mne.tools/mne-bids/
BSD 3-Clause "New" or "Revised" License

BIDS conversion of anatomical scans #693

Open eort opened 3 years ago

eort commented 3 years ago

Hi,

I was trying to use mne-bids to BIDS-format anatomical scans and integrate them with the MEG part of the data set. In the process, I got a bit confused. My idea was to first create this BIDS dataset and deal with the co-registration of head, MEG, and MRI spaces later. So, basically, I expected to be able to produce a nifti file and associated sidecars (scans.tsv and a sub-XX_T1w.json) based on the header info. However, mne-bids seems to produce only the nifti without any sidecars, unless I provide transform and/or landmark information, right?

I guess that makes sense from an MEG/MRI integration point of view, but isn't it somewhat restrictive to prevent all sidecars from being produced just because some information is missing? I mean, all the MRI-related scanning parameters, as well as the scans.tsv, should be independent of co-registration and could be processed regardless of whether transform info is present. In contrast, from your MRI example, I get the impression that the only sidecars produced in the MRI-scan conversion are related to the co-registration, but maybe that is just for the sake of illustration.

Aside from that, I am also unsure whether this approach is completely in line with the BIDS derivative vs. raw principle. Isn't co-registration a preprocessing step that is done on raw data and its products (transform files, etc.) should therefore be a derivative? Admittedly, I have only a rough idea of the process of co-registration, so maybe I am talking nonsense.

Finally, I was also wondering whether it would make sense to provide support for reading dicom scans. Right now, I have to convert a dicom to a nifti before I can use it with mne_bids. That extra step is a little awkward in the process. The API of write_anat says that the source image "Can be in any format readable by nibabel". Dicoms cannot directly be loaded with nibabel, but need some special treatment (see here). Not sure whether this can be integrated into the mne-bids framework?
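In case it helps future readers, the extra step can at least be scripted. Here is a minimal sketch (the helper name is mine, and it assumes the dcm2niix command-line tool is installed) that builds the conversion call, including the flag that writes a BIDS-style json sidecar next to the nifti:

```python
import subprocess

def build_dcm2niix_cmd(dicom_dir, out_dir, basename="T1w"):
    """Build the dcm2niix call that converts a DICOM series to a
    compressed NIfTI plus a BIDS-style JSON sidecar."""
    return [
        "dcm2niix",
        "-z", "y",       # gzip the output NIfTI
        "-b", "y",       # also write a JSON sidecar with scan parameters
        "-f", basename,  # output file name
        "-o", out_dir,   # output directory
        dicom_dir,       # input DICOM folder
    ]

# Usage sketch (paths are placeholders):
# subprocess.run(build_dcm2niix_cmd("sub-01/dicom", "sub-01/nifti"), check=True)
```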

sappelhoff commented 3 years ago

Just noticed that perhaps #617 is related

eort commented 3 years ago

> However, mne-bids seems to produce only the nifti without any sidecars, unless I provide transform and/or landmark information, right?

Running the conversion with example data, I realized that the only information being written to the T1w.json is the coordinates of the anatomical landmarks. So it makes sense that when the trans file is left out, no json sidecar is written (as it would be empty anyway).
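For illustration (the coordinate values here are made up), the sidecar written when landmarks are available contains essentially just this one field, with the positions given in voxel coordinates:

```json
{
  "AnatomicalLandmarkCoordinates": {
    "LPA": [66.1, 51.6, 46.0],
    "NAS": [41.9, 32.1, 74.3],
    "RPA": [17.9, 53.6, 47.5]
  }
}
```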

> Right now, I have to convert a dicom to a nifti before I can use it with mne_bids

By now I have also realized that I need to run FreeSurfer anyway to get the surfaces and be able to produce the trans.fif file. So, my idea of what this write_anat function is supposed to do was somewhat off.

After becoming a bit more familiar with the matter, I see the following things that could/should be done in the context of bidsifying anatomical data.

I believe I have read somewhere that writing derivatives is not a priority right now, but if this changes one day, it would make sense to add the FreeSurfer output folder as a derivative to the dataset, particularly as files in these folders are needed for some further analyses in source space. In any case, adding dicom support to mne-bids is most likely overkill and not worth the effort. For reference, this is the json sidecar that dcm2niix produced for my T1 scan:

```json
{
  "AcquisitionMatrixPE": 256,
  "AcquisitionNumber": 1,
  "AcquisitionTime": "17:42:44.577500",
  "BaseResolution": 256,
  "BodyPartExamined": "BRAIN",
  "ConsistencyInfo": "N4_VE11C_LATEST_20160120",
  "ConversionSoftware": "dcm2niix",
  "ConversionSoftwareVersion": "v1.0.20201224",
  "DeviceSerialNumber": "166064",
  "DwellTime": 7.8e-06,
  "EchoTime": 0.00245,
  "FlipAngle": 9,
  "ImageOrientationPatientDICOM": [-0.0749788, 0.997185, 0, 0, 0, -1],
  "ImageType": ["ORIGINAL", "PRIMARY", "M", "NORM", "DIS3D", "DIS2D"],
  "ImagingFrequency": 123.259,
  "InPlanePhaseEncodingDirectionDICOM": "ROW",
  "InstitutionAddress": "XXX",
  "InstitutionName": "XXX",
  "InstitutionalDepartmentName": "Department",
  "InversionTime": 0.9,
  "MRAcquisitionType": "3D",
  "MagneticFieldStrength": 3,
  "Manufacturer": "Siemens",
  "ManufacturersModelName": "Prisma",
  "Modality": "MR",
  "ParallelReductionFactorInPlane": 2,
  "PartialFourier": 1,
  "PatientPosition": "HFS",
  "PercentPhaseFOV": 100,
  "PercentSampling": 100,
  "PhaseEncodingSteps": 255,
  "PhaseResolution": 1,
  "PixelBandwidth": 250,
  "ProcedureStepDescription": "MR  Sch\u00e4del",
  "ProtocolName": "t1_mprage_tra_iso_neu",
  "PulseSequenceDetails": "%SiemensSeq%\\tfl",
  "ReceiveCoilActiveElements": "HE1-4;NE1,2",
  "ReceiveCoilName": "HeadNeck_20",
  "ReconMatrixPE": 256,
  "ReconstructionMethod": "\u00b7t\u00fd\u007f",
  "RefLinesPE": 24,
  "RepetitionTime": 2,
  "SAR": 0.0530676,
  "ScanOptions": "IR",
  "ScanningSequence": "GR\\IR",
  "SequenceName": "*tfl3d1_16",
  "SequenceVariant": "SK\\SP\\MP",
  "SeriesDescription": "t1_mprage_tra_iso_neu",
  "SeriesNumber": 2,
  "ShimSetting": [1012, -11573, -11477, 34, 1, -11, -25, -6],
  "SliceThickness": 1,
  "SoftwareVersions": "syngo MR E11",
  "StationName": "AWP166064",
  "TxRefAmp": 263.338
}
```
sappelhoff commented 3 years ago

+1 to both your suggestions (especially the first one, which should be relatively straightforward)

adam2392 commented 3 years ago

Chiming in here, +1 for both suggestions as well.

Re adding more metadata from the nifti file: that would be awesome. From my limited experience working with heudiconv, no package robustly extracts the BIDS metadata from a nifti file (only from dicoms), so if we could have that as part of write_anat, that would make that function very awesome!

eort commented 3 years ago

Cool.

> (especially the first one, which should be relatively straightforward)

Yup, I think so, too. The only thing that will be interesting is extracting the recording date, which might be a nice exercise to then also extract other information for the json sidecar.

> no package robustly extracts the BIDS metadata from a Nifti file (only from dicoms)

Okay, I'll have a look into the options. But would adding heudiconv as a dependency be an option, or would it be better to keep mne-bids as lightweight as possible?

adam2392 commented 3 years ago

Re heudiconv: I don't think we need it as a dependency. I suppose, if possible, see how they extract metadata and what they extract, to see if we can just replicate that in write_anat by reading in the nifti.hdr?

I think there is some loss of info from .dicoms -> .nii though, so it might not be possible to do it for all types of data...
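To make that loss concrete: the NIfTI-1 header is a fixed 348-byte struct that stores geometry and a free-text description, but notably no acquisition date or time. A stdlib-only sketch (not something mne-bids provides; field offsets taken from the NIfTI-1 standard) that reads a few of those fields from an uncompressed .nii file:

```python
import struct

def read_nifti1_header(fname):
    """Read a few fields from an (uncompressed) little-endian NIfTI-1 file.
    Offsets per the NIfTI-1 standard: sizeof_hdr@0 (int32), dim@40 (8x int16),
    datatype@70 (int16), pixdim@76 (8x float32), descrip@148 (80 bytes).
    Note: there is no acquisition date/time field anywhere in the header."""
    with open(fname, "rb") as f:
        hdr = f.read(348)
    sizeof_hdr = struct.unpack_from("<i", hdr, 0)[0]
    assert sizeof_hdr == 348, "not a little-endian NIfTI-1 file"
    dim = struct.unpack_from("<8h", hdr, 40)          # dim[0] = n dimensions
    datatype = struct.unpack_from("<h", hdr, 70)[0]
    pixdim = struct.unpack_from("<8f", hdr, 76)       # voxel sizes
    descrip = struct.unpack_from("80s", hdr, 148)[0]
    return {
        "shape": dim[1:1 + dim[0]],
        "datatype": datatype,
        "voxel_size": pixdim[1:1 + dim[0]],
        "descrip": descrip.rstrip(b"\x00").decode(errors="replace"),
    }
```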

eort commented 3 years ago
Updating scans.tsv

Normally, the scans.tsv already exists when the anatomical scan is being read. So, I could write some code to read, update it and write it back to file again (as I already do with my own data). But, I guess the cleaner way would be to extend https://github.com/mne-tools/mne-bids/blob/f9eed4c34c06a3a71cbf092bc07d7593ffca0042/mne_bids/sidecar_updates.py#L16

to also be able to update .tsv files and not just .json files. So, would it make more sense to first invest some time into adding .tsv support to that function, instead of diving straight into the specific write_anat issue?
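As a rough idea of what the tsv side could look like (the function name and behavior are my own sketch, not an existing mne-bids API; it uses only the stdlib csv module with BIDS' tab delimiter and "n/a" convention for missing values):

```python
import csv

def upsert_scans_row(scans_tsv, filename, acq_time, missing="n/a"):
    """Add or update one row in a BIDS *_scans.tsv file.
    BIDS .tsv files are tab-separated and use 'n/a' for missing values."""
    with open(scans_tsv, newline="") as f:
        reader = csv.DictReader(f, delimiter="\t")
        fieldnames = list(reader.fieldnames)
        rows = list(reader)
    if "acq_time" not in fieldnames:
        fieldnames.append("acq_time")
    for row in rows:
        row.setdefault("acq_time", missing)
    for row in rows:
        if row["filename"] == filename:  # existing entry: update in place
            row["acq_time"] = acq_time
            break
    else:  # no matching row: append a new one
        new = {name: missing for name in fieldnames}
        new.update(filename=filename, acq_time=acq_time)
        rows.append(new)
    with open(scans_tsv, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=fieldnames, delimiter="\t")
        writer.writeheader()
        writer.writerows(rows)
```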

eort commented 3 years ago

> I think there is some loss of info from .dicoms -> .nii though, so it might not be possible to do it for all types of data though...

Indeed, the nifti header is much less comprehensive than the dicom header. Most relevant to this issue here, nifti headers don't seem to have information on acquisition time and date. Therefore, there is not much that can be done here with write_anat. In order to add date/time to scans.tsv and populate the anat sidecar with metadata we would need access to the original dicom. From there, I can see three scenarios:

1) Process niftis and dicoms: let write_anat do its thing with the niftis as it currently does, and add support for reading dicom images and extracting relevant metadata that can then be used to populate scans.tsv and the json sidecar.

pros:

con:

2) Only process dicoms: let mne-bids handle the entire conversion process, i.e. not just extract metainformation for the sidecar, but also do the dicom2nifti conversion.

pros:

cons:

3) Only process niftis and make use of pre-bidsified anatomical images: suggest that users do the BIDS conversion of the anatomical image with their favorite tool (e.g. heudiconv) and pass the path to the resulting BIDS directory to mne_bids, which then integrates the anat hierarchy with the mne-bids-made hierarchy and extends the scans.tsv and json sidecar with the missing information. (That's my current approach.)

pros:

cons:

4) Leave everything as is and pretend this issue never existed: instead, we could just inform/warn in the documentation that the information on the anatomical scan is incomplete and recommend that users do something about it (maybe with some pointers on how to do it).

My preference is 3, 1, 4, 2, in that order. In any case, I think this issue deserves some additional information in the docs, maybe as an info/warning box in the tutorial on how to write anatomical images, and maybe some nibabel-inspired disclaimer that this is experimental.
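For option 3, the integration step could be as small as copying the heudiconv-made anat folder into the mne-bids tree. A sketch with a hypothetical helper (not an existing mne-bids API; updating scans.tsv would still be a separate step):

```python
import shutil
from pathlib import Path

def merge_anat_dir(converted_root, mne_bids_root, subject):
    """Copy sub-<subject>/anat (NIfTI images + JSON sidecars) from a
    heudiconv-converted BIDS tree into an mne-bids-made BIDS tree."""
    src = Path(converted_root) / f"sub-{subject}" / "anat"
    dst = Path(mne_bids_root) / f"sub-{subject}" / "anat"
    dst.mkdir(parents=True, exist_ok=True)
    copied = []
    for f in src.iterdir():
        # .nii.gz files report suffix ".gz", plain .nii report ".nii"
        if f.is_file() and f.suffix in (".json", ".gz", ".nii"):
            shutil.copy2(f, dst / f.name)
            copied.append(f.name)
    return sorted(copied)
```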

adam2392 commented 3 years ago

Imo (not a strong opinion tho), 3) seems the most manageable and easiest solution. We should point to the relevant heudiconv documentation for going from:

* dicoms -> BIDS imaging folders, or dicoms -> nifti images with sidecar jsons

* take those BIDS imaging folders and use `write_anat` to add some additional things encoded in the `raw`?

However, if heudiconv goes from dicoms -> BIDS imaging folders... what's the point of write_anat? It seems the only workflow that enables at that point is the writing of landmarks?

adam2392 commented 3 years ago

> to also be able to update .tsv files and not just .json files. So, would it make more sense to first invest some time to add .tsv support to that function, instead of diving straight into the specific write_anat issue?

For reference: https://github.com/mne-tools/mne-bids/issues/634

updating tsv is a can of worms :p, but if we can do it, I am very excited.

eort commented 3 years ago

> to also be able to update .tsv files and not just .json files. So, would it make more sense to first invest some time to add .tsv support to that function, instead of diving straight into the specific write_anat issue?
>
> For reference: #634
>
> updating tsv is a can of worms :p, but if we can do it, I am very excited.

haha, yeah, that will be fun. But well, depending on what we decide regarding the dicom issue, it might not be necessary to mess with tsv files (at least for this issue).

eort commented 3 years ago

> Imo (not a strong opinion tho), 3) seems the most manageable and easiest solution. We should point to the relevant heudiconv documentation for going from:
>
> * dicoms -> BIDS imaging folders, or dicoms -> nifti images with sidecar jsons
>
> * take those BIDS imaging folders and use `write_anat` to add some additional things encoded in the `raw`?
>
> However, if heudiconv goes from dicoms -> BIDS imaging folders... what's the point of write_anat? It seems the only workflow that enables at that point is the writing of landmarks?

Right. Currently all that write_anat seems to do is write the landmarks, rename the nifti image, and put it into the BIDS hierarchy. So, if we went for 3, only the landmark routine would be left; instead, two BIDS directories would have to be merged.