poldracklab / tacc-openneuro


ds002242: numpy.core._exceptions._ArrayMemoryError #36

Open jbwexler opened 1 year ago

jbwexler commented 1 year ago

Many subjects in this dataset produced similar though not always identical errors.

Node: mriqc_wf.funcMRIQC.ReportsWorkflow.BigPlot Working directory: /scratch1/03201/jbwexler/work_dir/mriqc/ds002242_sub-3816/mriqc_wf/funcMRIQC/ReportsWorkflow/_infile..scratch1..03201..jbwexler..openneuro_derivatives..derivatives..mriqc..ds002242-mriqc..sourcedata..raw..sub-3816..func..sub-3816_task-stt_run-1_bold.nii.gz/BigPlot

Node inputs:

drop_trs = 0
dvars = <undefined>
fd = <undefined>
fd_thres = 0.2
in_func = <undefined>
in_segm = <undefined>
in_spikes_bg = <undefined>
outliers = <undefined>
tr = None

Traceback (most recent call last):
  File "/opt/conda/lib/python3.9/site-packages/mriqc/engine/plugin.py", line 60, in run_node
    result["result"] = node.run(updatehash=updatehash)
  File "/opt/conda/lib/python3.9/site-packages/nipype/pipeline/engine/nodes.py", line 524, in run
    result = self._run_interface(execute=True)
  File "/opt/conda/lib/python3.9/site-packages/nipype/pipeline/engine/nodes.py", line 642, in _run_interface
    return self._run_command(execute)
  File "/opt/conda/lib/python3.9/site-packages/nipype/pipeline/engine/nodes.py", line 750, in _run_command
    raise NodeExecutionError(
nipype.pipeline.engine.nodes.NodeExecutionError: Exception raised while executing Node BigPlot.

Traceback (most recent call last):
  File "/opt/conda/lib/python3.9/site-packages/nipype/interfaces/base/core.py", line 398, in run
    runtime = self._run_interface(runtime)
  File "/opt/conda/lib/python3.9/site-packages/niworkflows/interfaces/plotting.py", line 96, in _run_interface
    _nifti_timeseries(input_data, seg_file)
  File "/opt/conda/lib/python3.9/site-packages/niworkflows/utils/timeseries.py", line 64, in _nifti_timeseries
    data = dataset.get_fdata(dtype="float32").reshape((-1, dataset.shape[-1]))
  File "/opt/conda/lib/python3.9/site-packages/nibabel/dataobj_images.py", line 355, in get_fdata
    data = np.asanyarray(self._dataobj, dtype=dtype)
  File "/opt/conda/lib/python3.9/site-packages/nibabel/arrayproxy.py", line 391, in __array__
    arr = self._get_scaled(dtype=dtype, slicer=())
  File "/opt/conda/lib/python3.9/site-packages/nibabel/arrayproxy.py", line 360, in _get_scaled
    scaled = scaled.astype(np.promote_types(scaled.dtype, dtype), copy=False)
numpy.core._exceptions._ArrayMemoryError: Unable to allocate 2.31 GiB for an array with shape (100, 100, 72, 863) and data type float32
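For what it's worth, the reported allocation is internally consistent with the shape and dtype in the error message; a quick sanity check in plain NumPy (shape and dtype copied from the traceback, nothing else assumed):

```python
import numpy as np

# Shape and dtype taken from the _ArrayMemoryError message above.
shape = (100, 100, 72, 863)
itemsize = np.dtype("float32").itemsize  # 4 bytes per element

n_bytes = int(np.prod(shape)) * itemsize
print(f"{n_bytes / 2**30:.2f} GiB")  # prints 2.31 GiB
```

Since `get_fdata(dtype="float32")` materializes (and caches) the entire 4D series at once, possible workarounds would be either giving each task more memory, or reading volume-by-volume through nibabel's array proxy (e.g. slicing `dataset.dataobj[..., t]`) instead of `get_fdata` — though the latter would require a change in niworkflows' `_nifti_timeseries`, since the carpet plot ultimately needs the full series anyway.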