nipreps / fmriprep

fMRIPrep is a robust and easy-to-use pipeline for preprocessing of diverse fMRI data. The transparent workflow dispenses with manual intervention, thereby ensuring the reproducibility of the results.
https://fmriprep.org
Apache License 2.0

--bids-filter-file seems not to be working as expected #2368


GongZhengxin commented 3 years ago

I want to run only the files in session ses-LOC with task-retinotopy. I wrote my Filter.json as:

{ "t1w": { "datatype": "anat", "session": "ImageNet02", "acquisition": null, "suffix": "T1w" }, "bold": { "datatype": "func", "session": "LOC", "task": "retinotopy", "suffix": "bold" } }

Because the documentation on this option is somewhat ambiguous to me, I am not sure my usage is correct. My command line follows:

fmriprep-docker $bids_fold $out_dir participant \
    --skip-bids-validation --participant-label core02 \
    --bids-filter-file $home/Filter.json \
    --fs-license-file $license_file \
    --output-spaces anat MNI152NLin6Asym:res-2 fsLR \
    --cifti-output 91k -w $work_dir

But it didn't work as expected: fMRIPrep loaded all the existing files in every session folder, with all task labels. This has been bothering me a lot.

By the way, a further question about fMRIPrep: when --use-aroma is enabled, which volume data is the CIFTI output generated from? (i.e., the denoised ~desc-smoothAROMAnonaggr_bold.nii.gz or its counterpart ~desc-preproc_bold.nii.gz?)
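
Edit: as a sanity check on the filter itself, the same entity queries can be run through PyBIDS directly before launching fMRIPrep. The sketch below uses the standard PyBIDS API; the dataset path is a placeholder:

from bids import BIDSLayout
from bids.layout import Query

layout = BIDSLayout("/path/to/bids_dataset")  # placeholder path

# Same entities as the "bold" block of the filter file
bold_files = layout.get(
    subject="core02",
    datatype="func",
    session="LOC",
    task="retinotopy",
    suffix="bold",
)

# Same entities as the "t1w" block; null in the filter file maps to Query.NONE,
# i.e. the acquisition entity must be absent
t1w_files = layout.get(
    subject="core02",
    datatype="anat",
    session="ImageNet02",
    acquisition=Query.NONE,
    suffix="T1w",
)

print(bold_files)
print(t1w_files)

If these queries return exactly the expected files, the entities in the filter are correct.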

htwangtw commented 3 years ago

I believe I have run into the same problem. Here's my json file:

{
  "bold": 
  {
    "datatype": "func",
    "session": "TRT",
    "acquisition": "645",
    "suffix": "bold"
  },
  "t1w": {
    "datatype": "anat",
    "session": "TRT",
    "suffix": "T1w"
  }
}

If there's a problem with the content of this filter file, it's not clear to me how it differs from the example in the docs. Along the same lines, one suggestion: fMRIPrep could raise a syntax error when an invalid JSON file is passed.
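
Until something like that is in place, a quick way to rule out plain syntax errors is to parse the filter file with Python's standard library before handing it to fMRIPrep (a minimal sketch; the filename is whatever is passed to --bids-filter-file):

import json

# Raises json.JSONDecodeError, with a line/column number, on a syntax error
with open("sub-A00027167_filter.json") as f:  # placeholder filename
    filters = json.load(f)

print(json.dumps(filters, indent=2))

This only catches syntax errors, not a filter that is valid JSON but selects nothing.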

effigies commented 3 years ago

It's possible that there's a bug here. Are you using public data, or something you can share?

htwangtw commented 3 years ago

I am running on the enhanced NKI dataset. For now, as a workaround, I have just deleted all the data that is irrelevant to my analysis.

Here's the full SGE cluster submission script:

#!/bin/bash
#$ -N dbg_fmriprep 
#$ -pe openmp 8
#$ -l m_mem_free=4G
#$ -l 'h=!node069&!node077&!node076' 

DATA_DIR=/research/cisc2/projects/myproject/enhanced-nki-dataset
SCRATCH_DIR=/research/cisc1/projects/myproject

singularity run --cleanenv \
    -B ${DATA_DIR}/data:/data \
    -B ${SCRATCH_DIR}/:/out \
    -B ${SCRATCH_DIR}/wd:/wd \
    ${HOME}/singularity-images/fmriprep-20.2.1.simg \
    --skip_bids_validation \
    --participant-label A00027167 \
    --bids-filter-file ${DATA_DIR}/tools/data/sub-A00027167_filter.json \
    --omp-nthreads 4 --nthreads 6 \
    --output-spaces MNI152NLin2009cAsym:res-2 \
    --fs-license-file ~/singularity-images/freesurfer_license.txt \
    --work-dir /wd \
    /data /out/ participant

I killed the job when I realised the filter was not working, so I cannot provide the full output. There is this warning from PyBIDS:

/usr/local/miniconda/lib/python3.7/site-packages/bids/layout/validation.py:46: UserWarning: The ability to pass arguments to BIDSLayout that control indexing is likely to be removed in future; possibly as early as PyBIDS 0.14. This includes the `config_filename`, `ignore`, `force_index`, and `index_metadata` arguments. The recommended usage pattern is to initialize a new BIDSLayoutIndexer with these arguments, and pass it to the BIDSLayout via the `indexer` argument.

Edit: the above warning appears regardless of whether the filter is used or not.
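
For reference, the usage pattern that warning recommends looks roughly like the following (a sketch of the PyBIDS API only; it is unrelated to whether the filter works):

from bids import BIDSLayout
from bids.layout import BIDSLayoutIndexer

# Indexing options move from BIDSLayout onto a BIDSLayoutIndexer instance
indexer = BIDSLayoutIndexer(ignore=["derivatives", "sourcedata"], index_metadata=True)
layout = BIDSLayout("/path/to/bids_dataset", indexer=indexer)  # placeholder path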

bpinsard commented 3 years ago

Sorry for jumping in late here. I think the problem should be fixed by https://github.com/nipreps/fmriprep/pull/2331. Without that fix, a non-existent filter file (or a file not accessible from inside the Docker/Singularity container) would fail silently. It was also failing silently if the JSON was not valid. We should include the fix in the next LTS minor release.
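
For anyone hitting this on an older release: the failure mode is that a filter file that cannot be read is effectively ignored, so fMRIPrep silently processes everything. The kind of up-front check that avoids this looks roughly like the sketch below (illustrative only, not the code from that PR):

import json
from pathlib import Path


def load_bids_filters(path):
    """Load a --bids-filter-file, failing loudly instead of silently."""
    filter_path = Path(path)
    if not filter_path.exists():
        # A host path that is not bind-mounted into the container ends up here
        raise FileNotFoundError(f"BIDS filter file not found: {filter_path}")
    try:
        return json.loads(filter_path.read_text())
    except json.JSONDecodeError as err:
        raise ValueError(f"BIDS filter file is not valid JSON: {filter_path}") from err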