poldracklab / tacc-openneuro


ds002685-mriqc: "Exception raised while executing Node _sanitize0" #89


jbwexler commented 2 months ago

This error occurs for all subjects:

Node: mriqc_wf.funcMRIQC.sanitize
Working directory: /node_tmp/work_dir/mriqc/ds002685_sub-08/mriqc_wf/funcMRIQC/8c213d6580f1ee439b9caa17073ed93405803422/sanitize

Node inputs:

in_file = ['/scratch1/03201/jbwexler/openneuro_derivatives/derivatives/mriqc/ds002685-mriqc/sourcedata/raw/sub-08/ses-13/func/sub-08_ses-13_task-MTTNS_dir-ap_run-03_bold.nii.gz']
max_32bit = True
n_volumes_to_discard = 0

Traceback (most recent call last):
  File "/opt/conda/lib/python3.11/site-packages/mriqc/engine/plugin.py", line 64, in run_node
    result['result'] = node.run(updatehash=updatehash)
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/conda/lib/python3.11/site-packages/nipype/pipeline/engine/nodes.py", line 527, in run
    result = self._run_interface(execute=True)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/conda/lib/python3.11/site-packages/nipype/pipeline/engine/nodes.py", line 1380, in _run_interface
    result = self._collate_results(
             ^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/conda/lib/python3.11/site-packages/nipype/pipeline/engine/nodes.py", line 1249, in _collate_results
    for i, nresult, err in nodes:
  File "/opt/conda/lib/python3.11/site-packages/nipype/pipeline/engine/utils.py", line 94, in nodelist_runner
    result = node.run(updatehash=updatehash)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/conda/lib/python3.11/site-packages/nipype/pipeline/engine/nodes.py", line 527, in run
    result = self._run_interface(execute=True)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/conda/lib/python3.11/site-packages/nipype/pipeline/engine/nodes.py", line 645, in _run_interface
    return self._run_command(execute)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/conda/lib/python3.11/site-packages/nipype/pipeline/engine/nodes.py", line 771, in _run_command
    raise NodeExecutionError(msg)
nipype.pipeline.engine.nodes.NodeExecutionError: Exception raised while executing Node _sanitize0.

Traceback:
        Traceback (most recent call last):
          File "/opt/conda/lib/python3.11/site-packages/nipype/interfaces/base/core.py", line 397, in run
            runtime = self._run_interface(runtime)
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
          File "/opt/conda/lib/python3.11/site-packages/niworkflows/interfaces/header.py", line 527, in _run_interface
            in_data[:, :, :, self.inputs.n_volumes_to_discard:],
            ~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
          File "/opt/conda/lib/python3.11/site-packages/nibabel/arrayproxy.py", line 463, in __getitem__
            return self._get_scaled(dtype=None, slicer=slicer)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
          File "/opt/conda/lib/python3.11/site-packages/nibabel/arrayproxy.py", line 424, in _get_scaled
            scaled = apply_read_scaling(self._get_unscaled(slicer=slicer), scl_slope, scl_inter)
                                        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
          File "/opt/conda/lib/python3.11/site-packages/nibabel/arrayproxy.py", line 403, in _get_unscaled
            return fileslice(
                   ^^^^^^^^^^
          File "/opt/conda/lib/python3.11/site-packages/nibabel/fileslice.py", line 779, in fileslice
            arr_data = read_segments(fileobj, segments, n_bytes, lock)
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
          File "/opt/conda/lib/python3.11/site-packages/nibabel/fileslice.py", line 671, in read_segments
            raise ValueError('Whoops, not enough data in file')
        ValueError: Whoops, not enough data in file
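
The nibabel "not enough data in file" error generally means the .nii.gz on disk contains fewer voxel bytes than its NIfTI header declares, i.e. the file appears truncated (for example, an incomplete fetch of the raw data). A minimal diagnostic sketch, assuming the file is a standard gzipped NIfTI; the relative path is taken from the traceback above and the re-fetch suggestion is an assumption, not a confirmed fix:

import gzip
import numpy as np
import nibabel as nib

# Path from the failing node's inputs (adjust to your local checkout)
path = (
    "sourcedata/raw/sub-08/ses-13/func/"
    "sub-08_ses-13_task-MTTNS_dir-ap_run-03_bold.nii.gz"
)

img = nib.load(path)
hdr = img.header

# Bytes the header says the data block should occupy
expected = int(np.prod(img.shape)) * hdr.get_data_dtype().itemsize

# Bytes actually present after the header offset in the decompressed stream
with gzip.open(path, "rb") as f:
    f.seek(int(hdr["vox_offset"]))
    actual = len(f.read())

print(f"expected {expected} bytes of voxel data, found {actual}")
if actual < expected:
    # File is truncated; re-fetching it (e.g. via datalad/git-annex)
    # before rerunning MRIQC is the likely remedy.
    print("file appears truncated")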