MASILab / Synb0-DISCO

Distortion correction of diffusion weighted MRI without reverse phase-encoding scans or field-maps
https://my.vanderbilt.edu/masi

Error running epi_reg #39

Closed Leprimaire closed 1 year ago

Leprimaire commented 1 year ago

Hello, I am trying to use Synb0-DISCO for the first time with Docker. I set up my raw T1 + first b0 + acqparams.txt (0 -1 0 0.0351891, and created the second line: 0 -1 0 0.000).
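
For reference, this is roughly how I wrote the acqparams.txt (just a sketch; the phase-encoding vector and readout time are the ones from my own data):

```bash
# Write the two-line acqparams.txt described above:
# first line  = my acquired b0 (readout time 0.0351891 s)
# second line = the synthetic undistorted b0 (readout time 0)
cat > INPUTS/acqparams.txt << 'EOF'
0 -1 0 0.0351891
0 -1 0 0.000
EOF
```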

In the end it shows: Failed to read volume /OUTPUTS/b0_all.nii.gz Error : No image files match: /OUTPUTS/b0_all

Looking inside the process, the first error seems to occur at this point, at line 320 of epi_reg, when running the FAST segmentation:

```
epi_reg distorted b0 to T1
epi_reg --epi=/INPUTS/b0.nii.gz --t1=/INPUTS/T1.nii.gz --t1brain=/tmp/tmp.M7wXzCLbwv/T1_mask.nii.gz --out=/tmp/tmp.M7wXzCLbwv/epi_reg_d
Running FAST segmentation
/extra/fsl/bin/epi_reg: line 320: 1401 Killed $FSLDIR/bin/fast -o ${vout}_fast ${vrefbrain}
Image Exception : #63 :: No image files match: /tmp/tmp.M7wXzCLbwv/epi_reg_d_fast_pve_2
terminate called after throwing an instance of 'std::runtime_error'
  what(): No image files match: /tmp/tmp.M7wXzCLbwv/epi_reg_d_fast_pve_2
/extra/fsl/bin/epi_reg: line 320: 1402 Aborted $FSLDIR/bin/fslmaths ${vout}_fast_pve_2 -thr 0.5 -bin ${vout}_fast_wmseg
Image Exception : #63 :: No image files match: /tmp/tmp.M7wXzCLbwv/epi_reg_d_fast_wmseg
terminate called after throwing an instance of 'std::runtime_error'
  what(): No image files match: /tmp/tmp.M7wXzCLbwv/epi_reg_d_fast_wmseg
/extra/fsl/bin/epi_reg: line 329: 1427 Aborted $FSLDIR/bin/fslmaths ${vout}_fast_wmseg -edge -bin -mas ${vout}_fast_wmseg ${vout}_fast_wmedge
FLIRT pre-alignment
Running BBR
Image Exception : #63 :: No image files match: /tmp/tmp.M7wXzCLbwv/epi_reg_d_fast_wmseg
Image Exception : #22 :: Failed to read volume /tmp/tmp.M7wXzCLbwv/epi_reg_d_fast_wmseg
Error : No image files match: /tmp/tmp.M7wXzCLbwv/epi_reg_d_fast_wmseg
Failed to read volume /tmp/tmp.M7wXzCLbwv/epi_reg_d_fast_wmseg
Error : No image files match: /tmp/tmp.M7wXzCLbwv/epi_reg_d_fast_wmseg
Could not open matrix file /tmp/tmp.M7wXzCLbwv/epi_reg_d.mat
/extra/fsl/bin/epi_reg: line 399: 1430 Segmentation fault $FSLDIR/bin/applywarp -i ${vepi} -r ${vrefhead} -o ${vout} --premat=${vout}.mat --interp=spline
```

Do you know how I can fix this?

Thank you very much, Charles

Diffusion-MRI commented 1 year ago

Can you send the full output displayed or written in the terminal? I believe the error may be happening before the lines you've copied. Can you also send the names of the files within the INPUTS folder?

Leprimaire commented 1 year ago

Yes, but I just left the lab; I will send it to you Monday morning!

Thank you for this extremely fast answer!

Best, Charles

Leprimaire commented 1 year ago

Hello,

The input file names are acqparams.txt, b0.nii.gz, and T1.nii.gz. The full log is attached:

log.txt

Diffusion-MRI commented 1 year ago

Strange - it seems to fail right at the beginning, with the T1 processing. Can you confirm that your file paths are correct, and also that the T1 is a 3D image that is not corrupted?
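
A quick sanity check (standard FSL tools, nothing specific to Synb0-DISCO) would be something like the sketch below; for a 3D image, dim4 should be 1:

```bash
# Check that the T1 is a readable 3D NIfTI (look for dim4 = 1)
fslinfo INPUTS/T1.nii.gz

# Dump the full header if fslinfo itself fails
fslhd INPUTS/T1.nii.gz
```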

Leprimaire commented 1 year ago

I work with the Mac Terminal, in a directory where I put INPUTS (with the 3 files needed) and OUTPUTS (empty). I have a 7T anatomical T1-weighted 3D file converted with dcm2niix from raw data (not skull stripped), easily readable with MRIcroGL 3, and a b0 file (fslroi 0 1 on my 4D DWI file).
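
Roughly, the preparation looked like this (a sketch from memory; the paths are placeholders):

```bash
# Convert the raw 7T T1 DICOMs to compressed NIfTI (not skull stripped)
dcm2niix -z y -o . /path/to/T1_dicom_dir

# Extract the first b0 volume from the 4D DWI series
fslroi dwi.nii.gz INPUTS/b0.nii.gz 0 1
```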

Leprimaire commented 1 year ago

In OUTPUTS I get the T1_mask, norm, norm_lin, norm_nonlin, the b0_d_smooth, and the 3 ANTs files, so the files look to be readable.

Diffusion-MRI commented 1 year ago

Do the files seem reasonable upon inspection? I'm surprised those files exist; the very first error encountered is very early in the pipeline, when the T1 is being read: "cp: cannot stat '/INPUTS/T1.nii.gz': No such file or directory". This usually means a corrupted file, or that the container can't see or write to the INPUTS path. Have you successfully run this pipeline before? I'm asking because if not, this is likely an error related to running the container; if so, it is likely an error related to your file itself.
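
One quick way to check the mount (just a sketch; the image name below is a placeholder for whatever image/tag you actually pulled) is to list /INPUTS from inside the container:

```bash
# Placeholder for the Synb0-DISCO image/tag you are running
SYNB0_IMAGE=your/synb0-disco-image:tag

# List the mounted INPUTS folder from inside the container
docker run --rm \
  -v "$(pwd)/INPUTS":/INPUTS \
  -v "$(pwd)/OUTPUTS":/OUTPUTS \
  --entrypoint ls \
  "$SYNB0_IMAGE" -l /INPUTS
```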

Leprimaire commented 1 year ago

OK, thank you a lot. This is the first time I have run this pipeline. Yes, the skull-stripped file looks good and the b0 is smoothed. I will try it on another machine.

Leprimaire commented 1 year ago

Do you have a sample dataset I can use to test with my setup?

Leprimaire commented 1 year ago

Hello again! I reinstalled my Docker configuration but got stuck one step later. The T1 is now readable, but the process crashes while the FAST segmentation is running.

Do you have an idea how to debug it?

log 2.txt

schillkg commented 1 year ago

Hi - let's try two things. First, can you show an example of your T1 image? Second, can you test on an example dataset? We have run this on many example datasets from OpenNeuro, as well as all the datasets shown in Figure 8 of this paper (https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0236418), which should be freely available by following the appropriate links.

Leprimaire commented 1 year ago

This is my T1.nii.gz image. I will test it with the dataset, thank you!!!

Screen Shot 2023-01-04 at 6 12 33 PM

Leprimaire commented 1 year ago

Hi, finally it works! I would like to thank you so much for your help, and to provide feedback in case it helps anyone.

I ran it with 2 example datasets, and the script crashed much later in the process than with my own data. So I figured out that the problem was my Docker configuration (even though I had given it 8.5 GB of RAM as recommended). I gave Docker all the resources my computer could provide, and it finally works (16 GB RAM, 4 processors, 4 GB swap).
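
For anyone hitting the same thing, a couple of standard Docker commands (nothing specific to this pipeline) to confirm what the daemon actually has available:

```bash
# Show the CPU count and total memory the Docker daemon can use
docker info --format 'CPUs: {{.NCPU}}  Memory: {{.MemTotal}} bytes'

# Watch per-container memory usage while the pipeline runs
docker stats --no-stream
```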