Hendrik-code / spineps

This is a segmentation pipeline to automatically, and robustly, segment the whole spine in T2w sagittal images.
Apache License 2.0

Error for T1 dataset #35

Open kartik7737 opened 2 days ago

kartik7737 commented 2 days ago

Describe the bug
I am running SPINEPS on a T1 dataset. It runs smoothly on the T2 dataset, but for T1 it fails with affine assertion errors: the zoom, origin, and shape of the generated segmentation do not match those of the input MR. The error report on the terminal:

(/physical_sciences/spineps/spineps-env) [kkumar@papr-res-gpu01 spineps]$ spineps dataset -i "/physical_sciences/spineps/Temp_data/Dataset/dataset-T1" -model_semantic T1w_Segmentor -model_instance Inst_Vertebra_3.0 -smrm -ibf -imc
──────────────────────────────── Thank you for using SPINEPS ────────────────────────────────
Please support our development by citing
GitHub: https://github.com/Hendrik-code/spineps
ArXiv: https://arxiv.org/abs/2402.16368
Thank you!
─────────────────────────────────────────────────────────────────────────────────────────────
[Models] Check available models...
Namespace(cmd='dataset', directory='/physical_sciences/spineps/Temp_data/Dataset/dataset-T1', raw_name='rawdata', model_semantic='T1w_Segmentor', model_instance='Inst_Vertebra_3.0', ignore_bids_filter=True, ignore_model_compatibility=True, save_log=False, save_snaps_folder=False, der_name='derivatives_seg', save_debug=False, save_softmax_logits=False, save_modelres_mask=True, override_semantic=False, override_instance=False, override_postpair=False, override_ctd=False, ignore_inference_compatibility=False, nocrop=False, non4=False, cpu=False, run_cprofiler=False, verbose=False)
/physical_sciences/spineps/spineps-env/spineps/spineps/utils/predictor.py:85: FutureWarning:

You are using torch.load with weights_only=False (the current default value), which uses the default pickle module implicitly. It is possible to construct malicious pickle data which will execute arbitrary code during unpickling (See https://github.com/pytorch/pytorch/blob/main/SECURITY.md#untrusted-models for more details). In a future release, the default value for weights_only will be flipped to True. This limits the functions that could be executed during unpickling. Arbitrary objects will no longer be allowed to be loaded via this mode unless they are explicitly allowlisted by the user via torch.serialization.add_safe_globals. We recommend you start setting weights_only=True for any use case where you don't have full control of the loaded file. Please open an issue on GitHub for any issues related to this experimental feature.
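The warning refers to PyTorch's unpickling behavior: with `weights_only=False`, `torch.load` deserializes checkpoints via Python's `pickle` module, which can execute arbitrary code during unpickling. A minimal stdlib-only illustration (no PyTorch involved, benign payload) of why unpickling untrusted data is risky:

```python
import pickle

# Unpickling invokes whatever callable __reduce__ names; a malicious
# checkpoint could name os.system here. This benign demo uses `list`.
class Payload:
    def __reduce__(self):
        return (list, ((1, 2, 3),))

data = pickle.dumps(Payload())
result = pickle.loads(data)
print(result)  # [1, 2, 3] -- a list was constructed, not a Payload instance
```

Passing `weights_only=True` to `torch.load` restricts unpickling to tensor data and explicitly allowlisted types, which is why PyTorch plans to flip the default.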

[] Model loaded from /physical_sciences/spineps/spineps-env/spineps/spineps/models/T1w_Segmentor
[] Model loaded from /physical_sciences/spineps/spineps-env/spineps/spineps/models/Inst_Vertebra_3.0
[SPINEPS] Initialize setup for dataset in /physical_sciences/spineps/Temp_data/Dataset/dataset-T1
[SPINEPS] (Modality.T2w, Acquisition.sag): model incompatible, model modalities [Modality.T1w]
[SPINEPS] Processing dataset in /physical_sciences/spineps/Temp_data/Dataset/dataset-T1
[SPINEPS] Found 19 Subjects in /physical_sciences/spineps/Temp_data/Dataset/dataset-T1, parents=['rawdata', 'derivatives_seg']

[SPINEPS] Processing 1 / 19 subject: 12a
[SPINEPS] Subject 1: 12a had no scans to be processed

[SPINEPS] Processing 2 / 19 subject: 12b
[SPINEPS] Subject 2: 12b had no scans to be processed

[SPINEPS] Processing 3 / 19 subject: 14a
[SPINEPS] Processing sub-14a__20190827_Sag_T1_Spine_Gd_Comp_scan_1_T1w.nii.gz

I think there is a problem in the pre-processing of T1 MR images: it changes the resolution, shape, and origin of the input MR, so the affine assertion fails.
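For reference, the failing assertion presumably compares the geometry metadata of the two NIfTI images. A rough, stdlib-only sketch of that kind of check (`affines_match` is my own illustrative name, not actual SPINEPS code):

```python
import math

def affines_match(aff_a, aff_b, shape_a, shape_b, tol=1e-4):
    """Hypothetical re-implementation of the kind of consistency check
    the assertion performs: the segmentation must share its shape and
    affine (orientation, zoom, origin) with the input MR."""
    if shape_a != shape_b:
        return False
    return all(
        math.isclose(x, y, abs_tol=tol)
        for row_a, row_b in zip(aff_a, aff_b)
        for x, y in zip(row_a, row_b)
    )

identity = [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]]
rescaled = [[0.8, 0, 0, 0], [0, 0.8, 0, 0], [0, 0, 0.8, 0], [0, 0, 0, 1]]

print(affines_match(identity, identity, (320, 320, 20), (320, 320, 20)))  # True
print(affines_match(identity, rescaled, (320, 320, 20), (320, 320, 20)))  # False: zoom changed by resampling
```

If pre-processing resamples the T1 image without resampling the output back, a check like this would fail exactly as described.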

kartik7737 commented 2 days ago

[SPINEPS] Processing sub-14a__20190827_Sag_T1_Spine_Gd_Comp_scan_1_T1w.nii.gz

Hendrik-code commented 2 days ago

Hey, very interesting problem. The thing is, if the assertion compared the wrong things, it should always fail, also for my data, which it doesn't. Can you send me the scan in question? It would make my debugging faster and easier. Anyway, thanks for bringing this up! =)

kartik7737 commented 2 days ago

Hi, thank you for the quick reply and for this amazing work. I tried running the code on individual T1 files from the SPINEPS sample, and it works. The issue only appears when I run it on the whole dataset:

spineps dataset -i "/physical_sciences/spineps/Temp_data/Dataset/dataset-T1" -model_semantic "/physical_sciences/spineps/spineps-env/spineps/spineps/models/T1w_Segmentor" -model_instance "/physical_sciences/spineps/spineps-env/spineps/spineps/models/Inst_Vertebra_3.0" -v

it fails to recognize the T1 models and gives the following output:

[SPINEPS] Processing 14 / 19 subject: 4b
[SPINEPS] Subject 14: 4b had no scans to be processed

[SPINEPS] Processing 15 / 19 subject: 4c
[SPINEPS] Subject 15: 4c had no scans to be processed

[SPINEPS] Processing 16 / 19 subject: 4d
[SPINEPS] Subject 16: 4d had no scans to be processed

[SPINEPS] Processing 17 / 19 subject: 4e
[SPINEPS] Subject 17: 4e had no scans to be processed

[SPINEPS] Processing 18 / 19 subject: 4f
[SPINEPS] Subject 18: 4f had no scans to be processed

[SPINEPS] Processing 19 / 19 subject: 4g
[SPINEPS] Subject 19: 4g had no scans to be processed

[SPINEPS] Processed 0 scans with [(Modality.T2w, Acquisition.sag)]

If I run the same command with the -imc flag added:

spineps dataset -i "/physical_sciences/spineps/Temp_data/Dataset/dataset-T1" -model_semantic "/physical_sciences/spineps/spineps-env/spineps/spineps/models/T1w_Segmentor" -model_instance "/physical_sciences/spineps/spineps-env/spineps/spineps/models/Inst_Vertebra_3.0" -v -imc

it gives the error that I mentioned in my first comment.
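The log line "(Modality.T2w, Acquisition.sag): model incompatible, model modalities [Modality.T1w]" suggests a compatibility gate between the scans found and the model's declared modalities. A hypothetical sketch (not the actual SPINEPS code) of why no scans are processed without -imc, and why -imc lets them through:

```python
# Hypothetical modality-compatibility gate: without the -imc override,
# scans whose modality is not in the model's list are skipped, which
# would explain "Processed 0 scans". With -imc, everything passes the
# gate and processing reaches the later affine assertion instead.
def should_process(scan_modality, model_modalities, ignore_compatibility=False):
    return ignore_compatibility or scan_modality in model_modalities

print(should_process("T1w", ["T1w"]))        # True
print(should_process("T2w", ["T1w"]))        # False -> scan skipped
print(should_process("T2w", ["T1w"], True))  # True via -imc
```

This is consistent with the earlier run, where the dataset was reported as (Modality.T2w, Acquisition.sag) while the model declared [Modality.T1w].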

kartik7737 commented 2 days ago

These scans are part of a clinical dataset, and I cannot share them because of ethics constraints :(

Hendrik-code commented 2 days ago

Hey, your dataset is probably not BIDS-conform, so what happens when you run this with both the -imc and -ibf flags? But it always works when you run it on only one of those files? That is strange, because the "run on dataset" pipeline also just runs the single-file processing under the hood... I will have a closer look later.
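For context on the BIDS-conformity point: BIDS filenames follow a strict key-value entity pattern. A rough sketch of such a filename check (the exact filter SPINEPS applies may differ, and this regex is only illustrative) shows why a name like the one in the log would not match:

```python
import re

# Illustrative BIDS-style pattern: sub-<label>, optional _key-value
# entities, then the _T1w suffix. Real BIDS validation is stricter.
BIDS_T1W = re.compile(r"^sub-[A-Za-z0-9]+(_[a-z]+-[A-Za-z0-9]+)*_T1w\.nii(\.gz)?$")

print(bool(BIDS_T1W.match("sub-14a_ses-01_T1w.nii.gz")))  # True
# Double underscore and free-text tokens break the entity pattern:
print(bool(BIDS_T1W.match(
    "sub-14a__20190827_Sag_T1_Spine_Gd_Comp_scan_1_T1w.nii.gz")))  # False
```

That would explain why the dataset walker finds "no scans to be processed" unless -ibf (ignore BIDS filter) is set.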