compneuro-ncu / fmridenoise

Tool for automatic denoising, denoising strategies comparison, and functional connectivity data quality control.
Apache License 2.0

Filter fails on very short runs #32

Open wiheto opened 5 years ago

wiheto commented 5 years ago

In very short runs (in my case these runs will be excluded later anyway, but they are still part of the BIDS structure) the Butterworth filter fails because there are too few time points.

Unless I am missing something obvious, there is currently no way to ignore specific files. Perhaps add an --ignore-file flag that could be used like:

fmridenoise ..... --ignore-file sub-02 run-04

This would exclude all files that match both arguments. Multiple --ignore-file flags could then be combined:

fmridenoise ...... --ignore-file sub-02 run-04 --ignore-file sub-04 task-rest --ignore-file task-test

Not sure if this is the best way to implement it, but if it sounds OK I can have a go at implementing it, since it's something I need.
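The proposed flag semantics could be sketched with argparse; this is hypothetical, since fmridenoise's actual CLI parser may work differently:

```python
import argparse

# Hypothetical sketch of the proposed repeatable --ignore-file flag.
parser = argparse.ArgumentParser(prog="fmridenoise")
parser.add_argument(
    "--ignore-file",
    action="append",   # the flag may be given multiple times
    nargs="+",         # each occurrence takes one or more entity tokens
    default=[],
    metavar="ENTITY",
    help="exclude files matching ALL listed entities (e.g. sub-02 run-04)",
)

args = parser.parse_args(
    ["--ignore-file", "sub-02", "run-04", "--ignore-file", "task-test"]
)
print(args.ignore_file)  # [['sub-02', 'run-04'], ['task-test']]
```

Each occurrence of the flag then becomes one AND-combined pattern, and separate occurrences are OR-combined.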

Crash report:

File: /home/william/packages/fmridenoise/crash-20190924-203041-william-Denoiser.a4-535c42e6-7063-4eb1-865e-0de0573621bb.pklz
Node: fmridenoise_wf.Denoiser
Working directory: /tmp/fmridenoise/fmridenoise_wf/_pipeline_path_..home..william..packages..fmridenoise..fmridenoise..pipelines..pipeline-Null.json/Denoiser

Node inputs:

conf_prep = ['/tmp/fmridenoise/prep_conf/sub-403_ses-postop_task-es_run-01_desc-confounds_regressors_prep_pipeline-Null.tsv', '/tmp/fmridenoise/prep_conf/sub-403_ses-postop_task-es_run-02_desc-confounds_regressors_prep_pipeline-Null.tsv', '/tmp/fmridenoise/prep_conf/sub-403_ses-postop_task-es_run-03_desc-confounds_regressors_prep_pipeline-Null.tsv', '/tmp/fmridenoise/prep_conf/sub-403_ses-postop_task-es_run-04_desc-confounds_regressors_prep_pipeline-Null.tsv', '/tmp/fmridenoise/prep_conf/sub-403_ses-postop_task-es_run-05_desc-confounds_regressors_prep_pipeline-Null.tsv', '/tmp/fmridenoise/prep_conf/sub-403_ses-postop_task-es_run-06_desc-confounds_regressors_prep_pipeline-Null.tsv']
entities = [{'datatype': 'func', 'run': 1, 'session': 'postop', 'subject': '403', 'task': 'es'}, {'datatype': 'func', 'run': 2, 'session': 'postop', 'subject': '403', 'task': 'es'}, {'datatype': 'func', 'run': 3, 'session': 'postop', 'subject': '403', 'task': 'es'}, {'datatype': 'func', 'run': 4, 'session': 'postop', 'subject': '403', 'task': 'es'}, {'datatype': 'func', 'run': 5, 'session': 'postop', 'subject': '403', 'task': 'es'}, {'datatype': 'func', 'run': 6, 'session': 'postop', 'subject': '403', 'task': 'es'}]
fmri_prep = ['/home/william/sherlock/scratch/data/esfmri/derivatives/fmriprep/sub-403/ses-postop/func/sub-403_ses-postop_task-es_run-01_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz', '/home/william/sherlock/scratch/data/esfmri/derivatives/fmriprep/sub-403/ses-postop/func/sub-403_ses-postop_task-es_run-02_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz', '/home/william/sherlock/scratch/data/esfmri/derivatives/fmriprep/sub-403/ses-postop/func/sub-403_ses-postop_task-es_run-03_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz', '/home/william/sherlock/scratch/data/esfmri/derivatives/fmriprep/sub-403/ses-postop/func/sub-403_ses-postop_task-es_run-04_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz', '/home/william/sherlock/scratch/data/esfmri/derivatives/fmriprep/sub-403/ses-postop/func/sub-403_ses-postop_task-es_run-05_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz', '/home/william/sherlock/scratch/data/esfmri/derivatives/fmriprep/sub-403/ses-postop/func/sub-403_ses-postop_task-es_run-06_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz']
fmri_prep_aroma = <undefined>
high_pass = 0.008
ica_aroma = False
low_pass = 0.08
output_dir = /tmp/fmridenoise/denoise
pipeline = <undefined>
smoothing = True
tr_dict = <undefined>

Traceback: 
Traceback (most recent call last):
  File "/home/william/anaconda3/lib/python3.6/site-packages/nipype/pipeline/plugins/linear.py", line 48, in run
    node.run(updatehash=updatehash)
  File "/home/william/anaconda3/lib/python3.6/site-packages/nipype/pipeline/engine/nodes.py", line 473, in run
    result = self._run_interface(execute=True)
  File "/home/william/anaconda3/lib/python3.6/site-packages/nipype/pipeline/engine/nodes.py", line 1269, in _run_interface
    self.config['execution']['stop_on_first_crash'])))
  File "/home/william/anaconda3/lib/python3.6/site-packages/nipype/pipeline/engine/nodes.py", line 1144, in _collate_results
    for i, nresult, err in nodes:
  File "/home/william/anaconda3/lib/python3.6/site-packages/nipype/pipeline/engine/utils.py", line 102, in nodelist_runner
    result = node.run(updatehash=updatehash)
  File "/home/william/anaconda3/lib/python3.6/site-packages/nipype/pipeline/engine/nodes.py", line 473, in run
    result = self._run_interface(execute=True)
  File "/home/william/anaconda3/lib/python3.6/site-packages/nipype/pipeline/engine/nodes.py", line 564, in _run_interface
    return self._run_command(execute)
  File "/home/william/anaconda3/lib/python3.6/site-packages/nipype/pipeline/engine/nodes.py", line 649, in _run_command
    result = self._interface.run(cwd=outdir)
  File "/home/william/anaconda3/lib/python3.6/site-packages/nipype/interfaces/base/core.py", line 376, in run
    runtime = self._run_interface(runtime)
  File "/home/william/packages/fmridenoise/fmridenoise/interfaces/denoising.py", line 117, in _run_interface
    t_r=tr
  File "/home/william/anaconda3/lib/python3.6/site-packages/nilearn/image/image.py", line 981, in clean_img
    ensure_finite=ensure_finite)
  File "/home/william/anaconda3/lib/python3.6/site-packages/nilearn/signal.py", line 528, in clean
    low_pass=low_pass, high_pass=high_pass)
  File "/home/william/anaconda3/lib/python3.6/site-packages/nilearn/signal.py", line 277, in butterworth
    timeseries[:] = sp_signal.filtfilt(b, a, timeseries)
  File "/home/william/anaconda3/lib/python3.6/site-packages/scipy/signal/signaltools.py", line 3126, in filtfilt
    ntaps=max(len(a), len(b)))
  File "/home/william/anaconda3/lib/python3.6/site-packages/scipy/signal/signaltools.py", line 3176, in _validate_pad
    "padlen, which is %d." % edge)
ValueError: The length of the input vector x must be at least padlen, which is 33.
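For reference, the failure can be reproduced outside the pipeline. This sketch assumes a 5th-order Butterworth band-pass, which is what the padlen of 33 in the traceback implies; nilearn's actual internals may differ:

```python
import numpy as np
from scipy import signal

# Assumed filter: 5th-order band-pass -> 11 coefficients per polynomial.
b, a = signal.butter(5, [0.01, 0.1], btype="band")
padlen = 3 * max(len(a), len(b))  # scipy's default filtfilt padding: 33 here

short_run = np.random.randn(20)   # fewer time points than padlen

# filtfilt raises ValueError when the signal is not longer than padlen,
# so a guard like this would let short runs be skipped instead of crashing.
if len(short_run) <= padlen:
    print(f"run too short to band-pass filter ({len(short_run)} <= {padlen})")
else:
    filtered = signal.filtfilt(b, a, short_run)
```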
wiheto commented 5 years ago

Actually, fmriprep has an ignore flag that works like:

[--ignore {fieldmaps,slicetiming,sbref} [{fieldmaps,slicetiming,sbref} ...]]

So how about it becomes:

--ignore {sub-02,run-3} [{task-test}]

Then all files matched by the BIDSLayout filter would get removed?
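A minimal sketch of such entity-based filtering, operating on entity dicts like the `entities` node input in the crash report (the token-to-entity-key mapping and function names are assumptions, not fmridenoise code):

```python
# Map CLI-style token prefixes to BIDS entity keys (assumed mapping).
KEY_MAP = {"sub": "subject", "ses": "session", "run": "run", "task": "task"}

def parse_token(token):
    """Split e.g. 'sub-02' into ('subject', '02')."""
    key, _, value = token.partition("-")
    return KEY_MAP.get(key, key), value

def matches(entities, pattern):
    """True if ALL tokens in one ignore pattern match this file's entities."""
    return all(str(entities.get(k)) == v for k, v in map(parse_token, pattern))

def apply_ignores(entity_list, ignore_patterns):
    """Keep only files that match none of the ignore patterns."""
    return [e for e in entity_list
            if not any(matches(e, p) for p in ignore_patterns)]

entities = [
    {"subject": "403", "run": 4, "task": "es"},
    {"subject": "403", "run": 5, "task": "es"},
]
print(apply_ignores(entities, [["sub-403", "run-4"]]))
# [{'subject': '403', 'run': 5, 'task': 'es'}]
```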

kfinc commented 5 years ago

That's a great idea! What do you think @kbonna?

kbonna commented 5 years ago

I see two possible solutions to this problem. We can follow @wiheto's advice and simply filter the BIDSGrab output to exclude files whose entity fields meet the specified criteria. This is easy. We could also try a more complex solution: check the length of all files and discard those that are too short.

However, I am wondering whether dropping just one file, assuming multiple subjects and sessions, won't cause a problem in one of the downstream interfaces responsible for aggregating results to calculate quality measures. If that is the case, we should first allow for the possibility of a single missing file by modifying the places where files or measures are aggregated. For now, one workaround is to exclude the subject entirely by not passing them to the -participant flag.
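The second option (checking lengths and discarding short runs) could be sketched like this; the file names are made up, and a real implementation would read the number of volumes from each NIfTI header, e.g. via nibabel:

```python
from scipy import signal

def min_timepoints(order=5, band=(0.01, 0.1)):
    """Minimum usable run length, derived from scipy's default filtfilt
    padding for the assumed Butterworth band-pass instead of hard-coding 33."""
    b, a = signal.butter(order, band, btype="band")
    return 3 * max(len(a), len(b)) + 1  # filtfilt needs strictly more than padlen

def drop_short_runs(runs, min_len):
    """runs: list of (path, n_volumes) pairs, e.g. from nibabel's img.shape[-1]."""
    kept = [(p, n) for p, n in runs if n >= min_len]
    dropped = [p for p, n in runs if n < min_len]
    return kept, dropped

kept, dropped = drop_short_runs(
    [("run-01_bold.nii.gz", 200), ("run-04_bold.nii.gz", 20)],
    min_timepoints(),
)
print(dropped)  # ['run-04_bold.nii.gz']
```

The dropped list could then feed into whatever bookkeeping the aggregation interfaces need for missing files.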

I will think about this over the weekend, and then we can come up with a solution.

wiheto commented 5 years ago

OK @kbonna and @kfinc, I won't do anything for now. Let me know if I can help once you have thought of a solution.

wiheto commented 5 years ago

> However, I am wondering whether dropping just one file, assuming multiple subjects and sessions, won't cause a problem in one of the downstream interfaces responsible for aggregating results to calculate quality measures.

@kbonna I was thinking about this; one way around it is to also add a -r/--run flag when the BIDS filtering is done. There shouldn't be a problem downstream if files are rejected at the start (i.e., explicitly specified by the user).

However, if files are rejected later on based on some metric, then some update to the BIDS layout will be needed.