Traceback (most recent call last):
  File "/opt/conda/lib/python3.9/site-packages/mriqc/engine/plugin.py", line 60, in run_node
    result["result"] = node.run(updatehash=updatehash)
  File "/opt/conda/lib/python3.9/site-packages/nipype/pipeline/engine/nodes.py", line 524, in run
    result = self._run_interface(execute=True)
  File "/opt/conda/lib/python3.9/site-packages/nipype/pipeline/engine/nodes.py", line 642, in _run_interface
    return self._run_command(execute)
  File "/opt/conda/lib/python3.9/site-packages/nipype/pipeline/engine/nodes.py", line 750, in _run_command
    raise NodeExecutionError(
nipype.pipeline.engine.nodes.NodeExecutionError: Exception raised while executing Node compute_tsnr.

Traceback (most recent call last):
  File "/opt/conda/lib/python3.9/site-packages/nipype/interfaces/base/core.py", line 398, in run
    runtime = self._run_interface(runtime)
  File "/opt/conda/lib/python3.9/site-packages/nipype/algorithms/confounds.py", line 927, in _run_interface
    data = np.nan_to_num(data)
  File "<__array_function__ internals>", line 180, in nan_to_num
  File "/opt/conda/lib/python3.9/site-packages/numpy/lib/type_check.py", line 516, in nan_to_num
    idx_posinf = isposinf(d)
  File "<__array_function__ internals>", line 180, in isposinf
  File "/opt/conda/lib/python3.9/site-packages/numpy/lib/ufunclike.py", line 53, in func
    return f(x, out=out, **kwargs)
  File "/opt/conda/lib/python3.9/site-packages/numpy/lib/ufunclike.py", line 196, in isposinf
    return nx.logical_and(is_inf, signbit, out)
numpy.core._exceptions._ArrayMemoryError: Unable to allocate 1.83 GiB for an array with shape (104, 104, 72, 2520) and data type bool

This error occurs for several subjects in the dataset. Is the solution simply to reduce the number of subjects processed per node?
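
For context, the failing allocation is one of the temporary boolean masks that np.nan_to_num builds internally: as the traceback shows, it calls isposinf (and likewise isnan and isneginf), each of which allocates a bool array the same shape as the input, on top of the copy of the data that nan_to_num makes by default. A minimal sketch of the arithmetic, using the shape from the error message and assuming float32 data (float64 would double the copy):

import numpy as np

# Shape reported in the ArrayMemoryError: (x, y, z, timepoints)
shape = (104, 104, 72, 2520)

# Each boolean mask costs one byte per element
mask_gib = np.prod(shape) * np.dtype(bool).itemsize / 2**30
print(f"boolean mask: {mask_gib:.2f} GiB")  # ~1.83 GiB, matching the traceback

# The copy nan_to_num makes by default is four bytes per element for float32
copy_gib = np.prod(shape) * np.dtype(np.float32).itemsize / 2**30
print(f"float32 copy: {copy_gib:.2f} GiB")  # ~7.31 GiB

So a single compute_tsnr node on a 2520-volume run can transiently need on the order of 10 GiB, and every subject running concurrently on the same node adds its own share. Reducing the number of subjects per node should therefore help; capping the parallelism of each MRIQC run, e.g. with its --nprocs option (check mriqc --help for the exact resource flags in your version), is another way to keep peak memory below the node's limit.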