poldracklab / tacc-openneuro


ds004007-mriqc #41

Open jbwexler opened 1 year ago

jbwexler commented 1 year ago

This error occurs for several subjects in the dataset. Is the solution simply to reduce the number of subjects per node?

Node: mriqc_wf.funcMRIQC.compute_tsnr
Working directory: /scratch1/03201/jbwexler/work_dir/mriqc/ds004007_sub-01/mriqc_wf/funcMRIQC/_infile..scratch1..03201..jbwexler..openneuro_derivatives..derivatives..mriqc..ds004007-mriqc..sourcedata..raw..sub-01..func..sub-01_task-listen_run-01_bold.nii.gz/compute_tsnr

Node inputs:

detrended_file = detrend.nii.gz
in_file = ['/scratch1/03201/jbwexler/work_dir/mriqc/ds004007_sub-01/mriqc_wf/funcMRIQC/fMRI_HMC/_infile..scratch1..03201..jbwexler..openneuro_derivatives..derivatives..mriqc..ds004007-mriqc..sourcedata..raw..sub-01..func..sub-01_task-listen_run-01_bold.nii.gz/motion_correct/sub-01_task-listen_run-01_bold_valid_volreg.nii.gz']
mean_file = mean.nii.gz
regress_poly =
stddev_file = stdev.nii.gz
tsnr_file = tsnr.nii.gz

Traceback (most recent call last):
  File "/opt/conda/lib/python3.9/site-packages/mriqc/engine/plugin.py", line 60, in run_node
    result["result"] = node.run(updatehash=updatehash)
  File "/opt/conda/lib/python3.9/site-packages/nipype/pipeline/engine/nodes.py", line 524, in run
    result = self._run_interface(execute=True)
  File "/opt/conda/lib/python3.9/site-packages/nipype/pipeline/engine/nodes.py", line 642, in _run_interface
    return self._run_command(execute)
  File "/opt/conda/lib/python3.9/site-packages/nipype/pipeline/engine/nodes.py", line 750, in _run_command
    raise NodeExecutionError(
nipype.pipeline.engine.nodes.NodeExecutionError: Exception raised while executing Node compute_tsnr.

Traceback (most recent call last):
  File "/opt/conda/lib/python3.9/site-packages/nipype/interfaces/base/core.py", line 398, in run
    runtime = self._run_interface(runtime)
  File "/opt/conda/lib/python3.9/site-packages/nipype/algorithms/confounds.py", line 927, in _run_interface
    data = np.nan_to_num(data)
  File "<__array_function__ internals>", line 180, in nan_to_num
  File "/opt/conda/lib/python3.9/site-packages/numpy/lib/type_check.py", line 516, in nan_to_num
    idx_posinf = isposinf(d)
  File "<__array_function__ internals>", line 180, in isposinf
  File "/opt/conda/lib/python3.9/site-packages/numpy/lib/ufunclike.py", line 53, in func
    return f(x, out=out, **kwargs)
  File "/opt/conda/lib/python3.9/site-packages/numpy/lib/ufunclike.py", line 196, in isposinf
    return nx.logical_and(is_inf, signbit, out)
numpy.core._exceptions._ArrayMemoryError: Unable to allocate 1.83 GiB for an array with shape (104, 104, 72, 2520) and data type bool
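For context, a minimal sketch of the arithmetic behind the failed allocation. The float32 dtype for the loaded BOLD series is an assumption; the boolean mask comes from np.nan_to_num, which calls isposinf internally, as seen in the traceback.

```python
import numpy as np

# Shape reported in the traceback for sub-01_task-listen_run-01_bold
shape = (104, 104, 72, 2520)
n_vox = np.prod(shape, dtype=np.int64)

# The BOLD series itself, assuming it is loaded as float32 (assumption:
# a different dtype would change this number)
data_gib = n_vox * np.dtype(np.float32).itemsize / 2**30

# np.nan_to_num calls isposinf/isneginf, each materializing a boolean
# mask of the same shape -- this is the 1.83 GiB allocation that fails
mask_gib = n_vox * np.dtype(bool).itemsize / 2**30

print(f"data (float32): {data_gib:.2f} GiB")   # ~7.31 GiB
print(f"bool mask:      {mask_gib:.2f} GiB")   # ~1.83 GiB
```

So the node needs the ~7 GiB array plus several same-shaped temporaries, and the bool mask is simply the first allocation that tips the node over its memory limit.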

effigies commented 1 year ago

So this is a ~7 GB data array once loaded from disk into memory. I think reducing the number of simultaneous subjects is probably necessary for this dataset.
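One way to act on that, as a rough sketch: split the dataset's subjects into smaller batches so fewer MRIQC processes share one node's memory. The batch size, paths, and reliance on the standard BIDS-Apps --participant-label argument are illustrative assumptions, not this repo's actual launcher logic.

```python
"""Rough sketch: emit MRIQC commands in small batches of participants
so that fewer subjects run concurrently on a single node."""


def chunk(labels, size):
    """Yield successive groups of at most `size` participant labels."""
    for i in range(0, len(labels), size):
        yield labels[i:i + size]


# Hypothetical participant list; in practice this would be read from the
# dataset's participants.tsv
subjects = [f"{i:02d}" for i in range(1, 21)]

# With ~7 GiB per BOLD run plus working copies, only a few subjects per
# node are feasible; tune the batch size to the node's actual memory.
for batch in chunk(subjects, 3):
    labels = " ".join(batch)
    print(
        "mriqc /path/to/ds004007 /path/to/derivatives participant "
        f"--participant-label {labels}"
    )
```

The right batch size depends on the node's memory and on how many long runs each subject has; the ~2,500-volume runs in this dataset are what push the per-subject footprint up.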