DCAN-Labs / abcd-hcp-pipeline

bids application for processing functional MRI data, robust to scanner, acquisition and age variability.
https://hub.docker.com/r/dcanumn/abcd-hcp-pipeline
BSD 3-Clause "New" or "Revised" License

Exception: error caught during stage: DCANBOLDProcessing #76

Closed: sunlianglong closed this issue 10 months ago

sunlianglong commented 2 years ago

Dear all,

When I used the ABCD-HCP pipeline to preprocess MRI data for two example subjects (HCP-style MRI with a fieldmap and T2w image for one subject; non-HCP-style MRI without a fieldmap or T2w for the other), I got the same error during the DCANBOLDProcessing step: "Exception: error caught during stage: DCANBOLDProcessing". Checking the logs, I found "FileNotFoundError: [Errno 2] No such file or directory: '/output/sub-A00018030/ses-BAS1/files/MNINonLinear/Results/ses-BAS1_task-rest_run-01/DCANBOLDProc_v4.0.0/ses-BAS1_task-rest_run-01_DCANBOLDProc_v4.0.0Atlas.dtseries.nii'" in DCANBOLDProcessing_teardown.err, and "/opt/dcan-tools/dcan_bold_proc/bin/dcan_signalprocesssing: error while loading shared libraries: libmwmclmcrrt.so.9.2: cannot open shared object file: No such file or directory" in ses-BAS1_task-rest_run-01.err.
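For reference, one quick way to double-check that shared-library error from inside the container is a sketch like the one below; it assumes ldd is available in the image and simply reuses the binary path from the log above. (MCC-compiled binaries normally get their library path from the pipeline's own environment, so this is only a rough check.)

singularity exec /HeLabData2/llsun/tool_packages/Container_Package/abcd-hcp-pipeline.sif \
  ldd /opt/dcan-tools/dcan_bold_proc/bin/dcan_signalprocesssing | grep "not found"
# if the MATLAB runtime library is missing, this should print a line like:
# libmwmclmcrrt.so.9.2 => not found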

By the way, the Docker image is the latest version from https://hub.docker.com/r/dcanumn/abcd-hcp-pipeline .

Looking forward to your reply!

singularity run -e \
  -B /HeLabData2/llsun/NKIdataset/BIDS_Test/BIDSData:/bids_input \
  -B /HeLabData2/llsun/NKIdataset/BIDS_Test/ABCDHCP_Results_CBDPtest:/output \
  -B /HeLabData2/llsun/NKIdataset/license.txt:/opt/freesurfer/license.txt \
  /HeLabData2/llsun/tool_packages/Container_Package/abcd-hcp-pipeline.sif \
  /bids_input /output \
  --participant-label CBDP0009A \
  --freesurfer-license=/opt/freesurfer/license.txt --ncpus 8

The pipeline must choose distortion correction method based on the type(s) of field maps available. The type of fieldmaps you have are either not able to be used in the abcd-hcp-pipeline or they are not properly identified in the BIDS format. The pipeline does not account for 'phasediff', 'magnitude', and 'fieldmap' field maps filetypes yet. If you have 'phasediff' and 'magnitude' field maps, please provide the original 'phase1', 'phase2', 'magnitude1', and 'magnitude2' field maps used to calculate those files. The pipeline does the calculation itself.
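For reference, a hypothetical BIDS fmap layout with the uncalculated field maps the warning asks for would look something like this (file names are only illustrative; subject/session labels would match your own data):

sub-CBDP0009A/
  fmap/
    sub-CBDP0009A_magnitude1.nii.gz
    sub-CBDP0009A_magnitude2.nii.gz
    sub-CBDP0009A_phase1.nii.gz
    sub-CBDP0009A_phase1.json
    sub-CBDP0009A_phase2.nii.gz
    sub-CBDP0009A_phase2.json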

running PreFreeSurfer
/opt/pipeline/PreFreeSurfer/PreFreeSurferPipeline.sh \
  --path=/output/sub-CBDP0009A/ses-None/files \
  --subject=CBDP0009A \
  --t1=/bids_input/sub-CBDP0009A/anat/sub-CBDP0009A_T1w.nii.gz \
  --t2=/bids_input/sub-CBDP0009A/anat/sub-CBDP0009A_T2w.nii.gz \
  --t1template=/opt/pipeline/global/templates/MNI152_T1_1mm.nii.gz \
  --t1templatebrain=/opt/pipeline/global/templates/MNI152_T1_1mm_brain.nii.gz \
  --t1template2mm=/opt/pipeline/global/templates/MNI152_T1_2mm.nii.gz \
  --t2template=/opt/pipeline/global/templates/MNI152_T2_1mm.nii.gz \
  --t2templatebrain=/opt/pipeline/global/templates/MNI152_T2_1mm_brain.nii.gz \
  --t2template2mm=/opt/pipeline/global/templates/MNI152_T2_2mm.nii.gz \
  --templatemask=/opt/pipeline/global/templates/MNI152_T1_1mm_brain_mask.nii.gz \
  --template2mmmask=/opt/pipeline/global/templates/MNI152_T1_2mm_brain_mask_dil.nii.gz \
  --brainsize=150 \
  --fnirtconfig=/opt/pipeline/global/config/T1_2_MNI152_2mm.cnf \
  --fmapmag=NONE \
  --fmapphase=NONE \
  --fmapgeneralelectric=NONE \
  --echodiff=NONE \
  --SEPhaseNeg=NONE \
  --SEPhasePos=NONE \
  --echospacing=NONE \
  --seunwarpdir=NONE \
  --t1samplespacing=0.000009301 \
  --t2samplespacing=0.000002097 \
  --unwarpdir=z \
  --gdcoeffs=NONE \
  --avgrdcmethod=NONE \
  --topupconfig=/opt/pipeline/global/config/b02b0.cnf \
  --useT2=true \
  --printcom= \
  --useStudyTemplate=false \
  --StudyTemplate=NONE \
  --StudyTemplateBrain=NONE

running FreeSurfer
/opt/pipeline/FreeSurfer/FreeSurferPipeline.sh \
  --subject=CBDP0009A \
  --subjectDIR=/output/sub-CBDP0009A/ses-None/files/T1w \
  --t1=/output/sub-CBDP0009A/ses-None/files/T1w/T1w_acpc_dc_restore.nii.gz \
  --t1brain=/output/sub-CBDP0009A/ses-None/files/T1w/T1w_acpc_dc_restore_brain.nii.gz \
  --t2=/output/sub-CBDP0009A/ses-None/files/T1w/T2w_acpc_dc_restore.nii.gz \
  --useT2=true \
  --printcom=

running PostFreeSurfer
/opt/pipeline/PostFreeSurfer/PostFreeSurferPipeline.sh \
  --path=/output/sub-CBDP0009A/ses-None/files \
  --subject=CBDP0009A \
  --surfatlasdir=/opt/pipeline/global/templates/standard_mesh_atlases \
  --grayordinatesdir=/opt/pipeline/global/templates/91282_Greyordinates \
  --grayordinatesres=2 \
  --hiresmesh=164 \
  --lowresmesh=32 \
  --subcortgraylabels=/opt/pipeline/global/config/FreeSurferSubcorticalLabelTableLut.txt \
  --freesurferlabels=/opt/pipeline/global/config/FreeSurferAllLut.txt \
  --refmyelinmaps=/opt/pipeline/global/templates/standard_mesh_atlases/Conte69.MyelinMap_BC.164k_fs_LR.dscalar.nii \
  --regname=MSMSulc \
  --reference2mm=/opt/pipeline/global/templates/MNI152_T1_2mm.nii.gz \
  --reference2mmmask=/opt/pipeline/global/templates/MNI152_T1_2mm_brain_mask_dil.nii.gz \
  --config=/opt/pipeline/global/config/T1_2_MNI152_2mm.cnf \
  --useT2=true \
  --t1template=/opt/pipeline/global/templates/MNI152_T1_1mm.nii.gz \
  --t1templatebrain=/opt/pipeline/global/templates/MNI152_T1_1mm_brain.nii.gz \
  --t1template2mm=/opt/pipeline/global/templates/MNI152_T1_2mm.nii.gz \
  --t2template=/opt/pipeline/global/templates/MNI152_T2_1mm.nii.gz \
  --t2templatebrain=/opt/pipeline/global/templates/MNI152_T2_1mm_brain.nii.gz \
  --t2template2mm=/opt/pipeline/global/templates/MNI152_T2_2mm.nii.gz \
  --templatemask=/opt/pipeline/global/templates/MNI152_T1_1mm_brain_mask.nii.gz \
  --template2mmmask=/opt/pipeline/global/templates/MNI152_T1_2mm_brain_mask_dil.nii.gz \
  --useStudyTemplate=false \
  --printcom=

running FMRIVolume
/opt/pipeline/fMRIVolume/GenericfMRIVolumeProcessingPipeline.sh \
  --path=/output/sub-CBDP0009A/ses-None/files \
  --subject=CBDP0009A \
  --fmriname=task-rest_run-1 \
  --fmritcs=/bids_input/sub-CBDP0009A/func/sub-CBDP0009A_task-rest_run-1_bold.nii.gz \
  --fmriscout=NONE \
  --SEPhaseNeg=NONE \
  --SEPhasePos=NONE \
  --fmapmag=NONE \
  --fmapphase=NONE \
  --fmapgeneralelectric=NONE \
  --echospacing=NONE \
  --echodiff=NONE \
  --unwarpdir=NONE \
  --fmrires=2.0 \
  --dcmethod=NONE \
  --gdcoeffs=NONE \
  --topupconfig=/opt/pipeline/global/config/b02b0.cnf \
  --printcom= \
  --biascorrection=NONE \
  --mctype=MCFLIRT \
  --useT2=true

running FMRISurface
/opt/pipeline/fMRISurface/GenericfMRISurfaceProcessingPipeline.sh \
  --path=/output/sub-CBDP0009A/ses-None/files \
  --subject=CBDP0009A \
  --fmriname=task-rest_run-1 \
  --lowresmesh=32 \
  --fmrires=2.0 \
  --smoothingFWHM=2 \
  --grayordinatesres=2 \
  --regname=MSMSulc

running DCANBOLDProcessing
/opt/dcan-tools/dcan_bold_proc/dcan_bold_proc.py \
  --subject=CBDP0009A \
  --output-folder=/output/sub-CBDP0009A/ses-None/files \
  --task=task-rest_run-1 \
  --fd-threshold=0.3 \
  --filter-order=2 \
  --lower-bpf=0.009 \
  --upper-bpf=0.08 \
  --motion-filter-type=notch \
  --physio=None \
  --motion-filter-option=5 \
  --motion-filter-order=4 \
  --band-stop-min=None \
  --band-stop-max=None \
  --brain-radius=50 \
  --skip-seconds=5 \
  --contiguous-frames=5

Traceback (most recent call last):
  File "/app/run.py", line 374, in <module>
    _cli()
  File "/app/run.py", line 68, in _cli
    return interface(**kwargs)
  File "/app/run.py", line 370, in interface
    stage.run(ncpus)
  File "/app/pipelines.py", line 588, in run
    self.teardown(result)
  File "/app/pipelines.py", line 941, in teardown
    super(__class__, self).teardown(result)
  File "/app/pipelines.py", line 534, in teardown
    self.__class__.__name__)
Exception: error caught during stage: DCANBOLDProcessing

madisoth commented 2 years ago

Hello and thank you for reporting that error-- it is a known issue in the v0.1.1 / "latest" release and we intend to have a fixed version released shortly; apologies for the trouble.

In the meantime, the prior release (v0.1.0) should work. v0.1.1 only changed the included Matlab runtime version (which is the cause of your error), and fixed a bandstop filter bug which wouldn't affect your dataset since you are not using the filter.

Also, if you still have the incomplete output, you should be able to run v0.1.0 with --stage DCANBOLDProcessing to resume from where the error was occurring (see the example below). If you still have issues with v0.1.0, let us know. Thanks!
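For example, resuming with v0.1.0 could look something like this (a sketch, assuming the v0.1.0 tag is published on Docker Hub and reusing the bind paths from your original command):

singularity pull abcd-hcp-pipeline_v0.1.0.sif docker://dcanumn/abcd-hcp-pipeline:v0.1.0

singularity run -e \
  -B /HeLabData2/llsun/NKIdataset/BIDS_Test/BIDSData:/bids_input \
  -B /HeLabData2/llsun/NKIdataset/BIDS_Test/ABCDHCP_Results_CBDPtest:/output \
  -B /HeLabData2/llsun/NKIdataset/license.txt:/opt/freesurfer/license.txt \
  abcd-hcp-pipeline_v0.1.0.sif \
  /bids_input /output \
  --participant-label CBDP0009A \
  --freesurfer-license=/opt/freesurfer/license.txt --ncpus 8 \
  --stage DCANBOLDProcessing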

sunlianglong commented 2 years ago

Hello Madison, thanks! It worked with the prior release (v0.1.0).

Will the next release on Docker Hub fix this bug?

arueter1 commented 2 years ago

@sunlianglong - we are checking on when the next Docker Hub fix is coming. More to come soon. Thanks for your patience!

madisoth commented 10 months ago

Should be resolved as of the latest release (v0.1.3 at this time).