nipreps / fmriprep

fMRIPrep is a robust and easy-to-use pipeline for preprocessing of diverse fMRI data. The transparent workflow dispenses with manual intervention, thereby ensuring the reproducibility of the results.
https://fmriprep.org
Apache License 2.0

OSError: Duplicate node name "func_preproc_*_wf" found. #3122

Closed: lsempf closed this issue 11 months ago

lsempf commented 1 year ago

What happened?

Hi,

I am currently trying to re-run preprocessing for a single-subject fMRI dataset. fMRIPrep completed successfully on the first run. However, since there were some errors with this subject (in the raw data), I wanted to preprocess the subject again. Even though I deleted the output of the first fMRIPrep run, I cannot process the subject again.

I repeatedly get this error:

Traceback (most recent call last):
  File "/opt/conda/lib/python3.9/multiprocessing/process.py", line 315, in _bootstrap
    self.run()
  File "/opt/conda/lib/python3.9/multiprocessing/process.py", line 108, in run
    self._target(*self._args, **self._kwargs)
  File "/opt/conda/lib/python3.9/site-packages/fmriprep/cli/workflow.py", line 115, in build_workflow
    retval["workflow"] = init_fmriprep_wf()
  File "/opt/conda/lib/python3.9/site-packages/fmriprep/workflows/base.py", line 92, in init_fmriprep_wf
    single_subject_wf = init_single_subject_wf(subject_id)
  File "/opt/conda/lib/python3.9/site-packages/fmriprep/workflows/base.py", line 434, in init_single_subject_wf
    workflow.connect([
  File "/opt/conda/lib/python3.9/site-packages/nipype/pipeline/engine/workflows.py", line 161, in connect
    self._check_nodes(newnodes)
  File "/opt/conda/lib/python3.9/site-packages/nipype/pipeline/engine/workflows.py", line 769, in _check_nodes
    raise IOError('Duplicate node name "%s" found.' % node.name)
OSError: Duplicate node name "func_preproc_ses_001_task_passive_run_1_wf" found.

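As far as I can tell, the exception is raised by Nipype's duplicate-node check: each per-run func_preproc_*_wf workflow that fMRIPrep builds must have a unique name within the subject workflow. A minimal sketch (illustrative only, not fMRIPrep code) that reproduces the same error by connecting two nodes that share a name, as could happen if two BOLD files in the dataset resolve to the same entities:

from nipype.interfaces.utility import IdentityInterface
from nipype.pipeline import engine as pe

wf = pe.Workflow(name="single_subject_wf")

# Two nodes that end up with the same name (hypothetical example)
node_a = pe.Node(IdentityInterface(fields=["in_file"]),
                 name="func_preproc_ses_001_task_passive_run_1_wf")
node_b = pe.Node(IdentityInterface(fields=["in_file"]),
                 name="func_preproc_ses_001_task_passive_run_1_wf")

# Workflow.connect() checks the new nodes and raises
# OSError: Duplicate node name "func_preproc_ses_001_task_passive_run_1_wf" found.
wf.connect([(node_a, node_b, [("in_file", "in_file")])])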
Does fMRIPrep write anything to the subject's BIDS folder? I had this problem some time ago, and deleting the BIDS folder and recreating it (via heudiconv) solved it. But I would like to be able to solve the problem differently this time.

Many thanks for your help!

Important note: I do not use DataLad or git, and I also removed the previous preprocessing outputs from the cluster.

What command did you use?

export SINGULARITYENV_FS_LICENSE=/opt/freesurfer/license.txt
export SINGULARITY_BINDPATH="/home/data:/home/data,$FS_LICENSE:$SINGULARITYENV_FS_LICENSE"

env -i SINGULARITY_BINDPATH="$SINGULARITY_BINDPATH" SINGULARITYENV_FS_LICENSE="$SINGULARITYENV_FS_LICENSE" /usr/bin/singularity run fmriprep-23.0.2.sif \
    --fs-license-file freesurfer_license.txt \
    --nprocs $req_cpu \
    --mem $req_ram \
    --skip_bids_validation \
    --slice-time-ref 0.5 \
    --dummy-scans 3 \
    --fs-no-reconall \
    --notrack \
    -vvv \
    $dir_bids \
    $dir_output \
    participant --participant-label $subject \
    --work $dir_work_root

What version of fMRIPrep are you running?

fMRIPrep version 23.0.2

How are you running fMRIPrep?

Singularity

Is your data BIDS valid?

No

Are you reusing any previously computed results?

No

Please copy and paste any relevant log output.

No response

Additional information / screenshots

No response

effigies commented 1 year ago

We only support valid BIDS datasets. This is probably a case where you have a zipped and an unzipped NIfTI file for the same image in the dataset.
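If it helps, here is a quick way to check (illustrative sketch; the BIDS path is a placeholder). It lists BOLD images that exist both as .nii and .nii.gz, which can make fMRIPrep try to build two identically named func_preproc_*_wf workflows:

from pathlib import Path

bids_dir = Path("/home/data/my_bids_dataset")  # placeholder path

# Group files by their path without the .nii / .nii.gz extension
images = {}
for nii in bids_dir.rglob("*_bold.nii*"):
    key = str(nii).replace(".nii.gz", "").replace(".nii", "")
    images.setdefault(key, []).append(nii.name)

for key, names in images.items():
    if len(names) > 1:
        print(f"Duplicate image: {key} -> {names}")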