nipreps / mriqc

Automated Quality Control and visual reports for Quality Assessment of structural (T1w, T2w) and functional MRI of the brain
http://mriqc.readthedocs.io
Apache License 2.0

mriqc: error: One or more participant labels were not found in the BIDS directory: #1118

rbareja25 closed this issue 8 months ago

rbareja25 commented 1 year ago

What happened?

I am running MRIQC with this command: 'singularity run --cleanenv --bind /brain_mri_bids/BIDS_SORTED:/data --bind /brain_mri_bids/MRIQC_OUTPUT:/out mriqc_latest_v2.sif /data /out participant --participant_label sub-ID1' and I am getting the following error: 'mriqc: error: One or more participant labels were not found in the BIDS directory: ID1'. I used DeepDicomSort to predict and sort my data, so it is in the BIDS_SORTED folder, which is a BIDS directory.

I have tried 3 different Docker containers from https://hub.docker.com/r/nipreps/mriqc/, but all three gave the same error. The tags I tried are 'latest', '23.1.0rc0', and '23.0.1'. According to this Neurostars post, https://neurostars.org/t/mriqc-one-or-more-participant-labels-were-not-found-in-the-bids-directory-on-hpc-with-singularity/25309/2, MRIQC 23.0.1 should resolve this issue, but I still get the error.
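For context, BIDS apps such as MRIQC resolve participant labels by comparing them against the top-level sub-* folders of the BIDS directory, and labels are normally accepted with or without the "sub-" prefix. The sketch below is a hypothetical illustration of that matching logic (`find_missing_labels` is not MRIQC's actual code), which shows the kind of check that produces the "labels were not found" error:

```python
# Hypothetical sketch of BIDS participant-label matching, NOT MRIQC's real code.
# Labels may be given with or without the "sub-" prefix; both should resolve to
# the same subject folder, and unmatched labels are reported back to the user.
import tempfile
from pathlib import Path


def find_missing_labels(bids_dir, labels):
    """Return the requested labels that have no sub-<label> folder in bids_dir."""
    present = {p.name[len("sub-"):] for p in Path(bids_dir).glob("sub-*") if p.is_dir()}
    requested = {lab[len("sub-"):] if lab.startswith("sub-") else lab for lab in labels}
    return sorted(requested - present)


with tempfile.TemporaryDirectory() as bids:
    (Path(bids) / "sub-ID1").mkdir()          # a minimal fake BIDS layout
    print(find_missing_labels(bids, ["sub-ID1"]))  # []
    print(find_missing_labels(bids, ["ID2"]))      # ['ID2']
```

If a check like this reports a label missing even though the sub-* folder exists, a stale index of the dataset (rather than the dataset itself) is a likely suspect.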

What command did you use?

singularity run --cleanenv --bind  /brain_mri_bids/BIDS_SORTED:/data --bind /brain_mri_bids/MRIQC_OUTPUT:/out mriqc_latest_v2.sif /data /out participant --participant_label ID1

What version of the software are you running?

singularity pull --name mriqc_latest_v2.sif docker://nipreps/mriqc:23.0.1

How are you running this software?

Singularity

Is your data BIDS valid?

Yes

Are you reusing any previously computed results?

Work directory

Please copy and paste any relevant log output.

mriqc version

Apptainer> mriqc -version
usage: mriqc [-h] [--version] [-v] [--species {human,rat}] [--participant-label PARTICIPANT_LABEL [PARTICIPANT_LABEL ...]] [--session-id [SESSION_ID ...]] [--run-id [RUN_ID ...]]
             [--task-id [TASK_ID ...]] [-m [MODALITIES ...]] [--dsname DSNAME] [--bids-database-dir PATH] [--nprocs NPROCS] [--omp-nthreads OMP_NTHREADS] [--mem MEMORY_GB] [--testing]
             [-f] [--pdb] [-w WORK_DIR] [--verbose-reports] [--reports-only] [--write-graph] [--dry-run] [--resource-monitor] [--use-plugin USE_PLUGIN] [--no-sub] [--email EMAIL]
             [--webapi-url WEBAPI_URL] [--webapi-port WEBAPI_PORT] [--upload-strict] [--notrack] [--ants-float] [--ants-settings ANTS_SETTINGS] [--ica] [--fft-spikes-detector]
             [--fd_thres FD_THRES] [--deoblique] [--despike] [--start-idx START_IDX] [--stop-idx STOP_IDX]
             bids_dir output_dir {participant,group} [{participant,group} ...]
mriqc: error: argument -v/--verbose: ignored explicit argument 'ersion'
Apptainer> mriqc --version
MRIQC v23.1.0rc0
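As an aside, the '-version' failure in the log above is standard argparse behavior, not an MRIQC bug: a single-dash '-version' is read as the short flag '-v' with an attached value 'ersion'. A minimal repro with a parser that mimics the same flag layout (this is not MRIQC's actual parser):

```python
# Why `mriqc -version` fails: argparse parses "-version" as the short option -v
# with the explicit (and rejected) argument "ersion". Use `--version` instead.
import argparse

parser = argparse.ArgumentParser(prog="mriqc")
parser.add_argument("-v", "--verbose", action="store_true")
parser.add_argument("--version", action="version", version="demo")

try:
    parser.parse_args(["-version"])   # triggers: ignored explicit argument 'ersion'
except SystemExit:
    print("parse failed: use --version (two dashes) instead")
```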

Additional information / screenshots

No response

rbareja25 commented 1 year ago

In addition to the above, I have now tried running 'singularity run --cleanenv --bind /brain_mri_bids/BIDS_SORTED:/data --bind /brain_mri_bids/MRIQC_OUTPUT:/out mriqc_latest_v2.sif /data /out participant --participant_label Cohort1'. The cases in the BIDS folder are named Cohort1_ID1, Cohort1_ID2, and so on. I am not sure whether this is the correct way to run it, though.
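One note on this attempt: --participant-label matches whole labels, not prefixes, so 'Cohort1' will not select sub-Cohort1_ID1 (and underscores are not valid characters inside a BIDS label, which may itself confuse the indexer). A hypothetical helper to expand a prefix into the full list of labels before calling MRIQC could look like this:

```python
# Hypothetical helper (not part of MRIQC): expand a subject-name prefix into the
# complete labels that --participant-label actually expects.
import tempfile
from pathlib import Path


def labels_with_prefix(bids_dir, prefix):
    """List full participant labels whose sub- folder name starts with prefix."""
    return sorted(
        p.name[len("sub-"):]
        for p in Path(bids_dir).glob(f"sub-{prefix}*")
        if p.is_dir()
    )


with tempfile.TemporaryDirectory() as bids:
    for name in ("sub-Cohort1_ID1", "sub-Cohort1_ID2", "sub-Cohort2_ID1"):
        (Path(bids) / name).mkdir()           # folder names as reported above
    print(labels_with_prefix(bids, "Cohort1"))  # ['Cohort1_ID1', 'Cohort1_ID2']
```

The resulting labels would then be passed explicitly, e.g. '--participant_label Cohort1_ID1 Cohort1_ID2'.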

Here is the log in this case:

Traceback (most recent call last):
  File "/opt/conda/lib/python3.9/site-packages/mriqc/engine/plugin.py", line 60, in run_node
    result["result"] = node.run(updatehash=updatehash)
  File "/opt/conda/lib/python3.9/site-packages/nipype/pipeline/engine/nodes.py", line 527, in run
    result = self._run_interface(execute=True)
  File "/opt/conda/lib/python3.9/site-packages/nipype/pipeline/engine/nodes.py", line 645, in _run_interface
    return self._run_command(execute)
  File "/opt/conda/lib/python3.9/site-packages/nipype/pipeline/engine/nodes.py", line 771, in _run_command
    raise NodeExecutionError(msg)
nipype.pipeline.engine.nodes.NodeExecutionError: Exception raised while executing Node PlotMosaicNoise.

Traceback:
Traceback (most recent call last):
  File "/opt/conda/lib/python3.9/site-packages/nipype/interfaces/base/core.py", line 398, in run
    runtime = self._run_interface(runtime)
  File "/opt/conda/lib/python3.9/site-packages/nireports/interfaces/mosaic.py", line 146, in _run_interface
    plot_mosaic(
  File "/opt/conda/lib/python3.9/site-packages/nireports/reportlets/mosaic.py", line 558, in plot_mosaic
    img_data = np.moveaxis(
  File "<__array_function__ internals>", line 180, in moveaxis
  File "/opt/conda/lib/python3.9/site-packages/numpy/core/numeric.py", line 1460, in moveaxis
    source = normalize_axis_tuple(source, a.ndim, 'source')
  File "/opt/conda/lib/python3.9/site-packages/numpy/core/numeric.py", line 1391, in normalize_axis_tuple
    axis = tuple([normalize_axis_index(ax, ndim, argname) for ax in axis])
  File "/opt/conda/lib/python3.9/site-packages/numpy/core/numeric.py", line 1391, in <listcomp>
    axis = tuple([normalize_axis_index(ax, ndim, argname) for ax in axis])
numpy.AxisError: source: axis 2 is out of bounds for array of dimension 2

Traceback (most recent call last):
  File "/opt/conda/bin/mriqc", line 8, in <module>
    sys.exit(main())
  File "/opt/conda/lib/python3.9/site-packages/mriqc/cli/run.py", line 168, in main
    mriqc_wf.run(**_plugin)
  File "/opt/conda/lib/python3.9/site-packages/nipype/pipeline/engine/workflows.py", line 638, in run
    runner.run(execgraph, updatehash=updatehash, config=self.config)
  File "/opt/conda/lib/python3.9/site-packages/mriqc/engine/plugin.py", line 184, in run
    self._clean_queue(jobid, graph, result=result)
  File "/opt/conda/lib/python3.9/site-packages/mriqc/engine/plugin.py", line 256, in _clean_queue
    raise RuntimeError("".join(result["traceback"]))
RuntimeError: Traceback (most recent call last):
  File "/opt/conda/lib/python3.9/site-packages/mriqc/engine/plugin.py", line 60, in run_node
    result["result"] = node.run(updatehash=updatehash)
  File "/opt/conda/lib/python3.9/site-packages/nipype/pipeline/engine/nodes.py", line 527, in run
    result = self._run_interface(execute=True)
  File "/opt/conda/lib/python3.9/site-packages/nipype/pipeline/engine/nodes.py", line 645, in _run_interface
    return self._run_command(execute)
  File "/opt/conda/lib/python3.9/site-packages/nipype/pipeline/engine/nodes.py", line 771, in _run_command
    raise NodeExecutionError(msg)
nipype.pipeline.engine.nodes.NodeExecutionError: Exception raised while executing Node PlotMosaicNoise.

Traceback:
Traceback (most recent call last):
  File "/opt/conda/lib/python3.9/site-packages/nipype/interfaces/base/core.py", line 398, in run
    runtime = self._run_interface(runtime)
  File "/opt/conda/lib/python3.9/site-packages/nireports/interfaces/mosaic.py", line 146, in _run_interface
    plot_mosaic(
  File "/opt/conda/lib/python3.9/site-packages/nireports/reportlets/mosaic.py", line 558, in plot_mosaic
    img_data = np.moveaxis(
  File "<__array_function__ internals>", line 180, in moveaxis
  File "/opt/conda/lib/python3.9/site-packages/numpy/core/numeric.py", line 1460, in moveaxis
    source = normalize_axis_tuple(source, a.ndim, 'source')
  File "/opt/conda/lib/python3.9/site-packages/numpy/core/numeric.py", line 1391, in normalize_axis_tuple
    axis = tuple([normalize_axis_index(ax, ndim, argname) for ax in axis])
  File "/opt/conda/lib/python3.9/site-packages/numpy/core/numeric.py", line 1391, in <listcomp>
    axis = tuple([normalize_axis_index(ax, ndim, argname) for ax in axis])
numpy.AxisError: source: axis 2 is out of bounds for array of dimension 2
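The numpy error at the bottom of this traceback can be reproduced in isolation: np.moveaxis with source axis 2 fails on a 2-D array, which suggests the PlotMosaicNoise node received a 2-D image where a 3-D volume was expected. A minimal repro (the mosaic-plotting context is inferred from the log, not from MRIQC's source):

```python
# Minimal reproduction of the AxisError in the log: moving axis 2 of a 2-D array.
# In MRIQC's report, this would mean a single slice reached code expecting a volume.
import numpy as np

img_2d = np.zeros((64, 64))        # a 2-D slice instead of a 3-D volume
try:
    np.moveaxis(img_2d, 2, 0)      # the mosaic code assumes at least 3 axes
except Exception as exc:
    print(type(exc).__name__, exc)  # reports the same AxisError seen in the log
```

Checking the dimensionality of the input NIfTI files (e.g. with nibabel's shape attribute) would confirm whether a 2-D acquisition slipped into the sorted dataset.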

And if I run 'group' here is the issue:

Traceback (most recent call last):
  File "/opt/conda/bin/mriqc", line 8, in <module>
    sys.exit(main())
  File "/opt/conda/lib/python3.9/site-packages/mriqc/cli/run.py", line 234, in main
    raise Exception(messages.GROUP_NO_DATA)
Exception: No data found. No group level reports were generated.

oesteban commented 1 year ago

Can you see a folder /brain_mri_bids/MRIQC_OUTPUT/.bids_db? If so, please delete it and retry with 23.1.0rc0. Please let us know the result, because this may confirm a bug.
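For anyone following the suggestion above, the idea is that .bids_db is a PyBIDS database cache written by an earlier run; deleting it forces the dataset to be re-indexed on the next run. A small sketch of the cleanup step (paths mirror the report, demonstrated on a temporary directory so it is self-contained):

```python
# Sketch of the suggested fix: remove the stale PyBIDS index so MRIQC rebuilds it.
# The real path would be /brain_mri_bids/MRIQC_OUTPUT/.bids_db; a temp dir stands
# in for it here.
import shutil
import tempfile
from pathlib import Path

with tempfile.TemporaryDirectory() as out_dir:
    bids_db = Path(out_dir) / ".bids_db"   # stale index left by an earlier run
    bids_db.mkdir()
    shutil.rmtree(bids_db)                 # delete it, then rerun MRIQC 23.1.0rc0
    print(bids_db.exists())                # False: cache gone, will be rebuilt
```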

tpatpa commented 1 year ago

I encountered the same issue with the MRIQC v23.0.0 Docker image, and it was resolved with MRIQC v23.1.0rc0. Thank you!

oesteban commented 8 months ago

Please reopen if more recent versions of MRIQC exhibit this problem.