nipreps / mriqc

Automated Quality Control and visual reports for Quality Assessment of structural (T1w, T2w) and functional MRI of the brain
http://mriqc.readthedocs.io
Apache License 2.0

TypeError: expected str, bytes or os.PathLike object, not list #1282

Closed DVSneuro closed 5 months ago

DVSneuro commented 5 months ago

What happened?

I'm getting TypeError: expected str, bytes or os.PathLike object, not list on the latest version of MRIQC. Data is available here: https://openneuro.org/datasets/ds005085

Thanks for any help!

What command did you use?

TEMPLATEFLOW_DIR=/ZPOOL/data/tools/templateflow
export APPTAINERENV_TEMPLATEFLOW_HOME=/opt/templateflow
singularity run --cleanenv \
-B ${TEMPLATEFLOW_DIR}:/opt/templateflow \
-B $maindir:/base \
-B $scratchdir:/scratch \
/ZPOOL/data/tools/mriqc-24.0.0.simg \
/base /base/derivatives/mriqc \
participant --participant_label $sub \
-w /scratch

What version of the software are you running?

24.0.0

How are you running this software?

Singularity

Is your data BIDS valid?

Yes

Are you reusing any previously computed results?

No

Please copy and paste any relevant log output.

(base) tug87422@cla18994:/ZPOOL/data/projects/ds005085$ bash code/mriqc.sh 10043
------------------------------------------------------------------
  Running MRIQC version 24.1.0.dev0+g3fe90466.d20240417
  ----------------------------------------------------------------
  * BIDS dataset path: /base.
  * Output folder: /base/derivatives/mriqc.
  * Analysis levels: ['participant'].
------------------------------------------------------------------

2024-04-17 12:29:12 | IMPORTANT | mriqc            | Building MRIQC's workflows...
2024-04-17 12:29:13 | IMPORTANT | mriqc            | DataLad dataset identified, attempting to `datalad get` unavailable files.
Process Process-2:
Traceback (most recent call last):
  File "/opt/conda/lib/python3.11/multiprocessing/process.py", line 314, in _bootstrap
    self.run()
  File "/opt/conda/lib/python3.11/multiprocessing/process.py", line 108, in run
    self._target(*self._args, **self._kwargs)
  File "/opt/conda/lib/python3.11/site-packages/mriqc/cli/workflow.py", line 56, in build_workflow
    retval['workflow'] = init_mriqc_wf()
                         ^^^^^^^^^^^^^^^
  File "/opt/conda/lib/python3.11/site-packages/mriqc/workflows/core.py", line 48, in init_mriqc_wf
    workflow.add_nodes([fmri_qc_workflow()])
                        ^^^^^^^^^^^^^^^^^^
  File "/opt/conda/lib/python3.11/site-packages/mriqc/workflows/functional/base.py", line 85, in fmri_qc_workflow
    _datalad_get(dataset)
  File "/opt/conda/lib/python3.11/site-packages/mriqc/utils/misc.py", line 257, in _datalad_get
    return get(
           ^^^^
  File "/opt/conda/lib/python3.11/site-packages/datalad/interface/base.py", line 773, in eval_func
    return return_func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/conda/lib/python3.11/site-packages/datalad/interface/base.py", line 763, in return_func
    results = list(results)
              ^^^^^^^^^^^^^
  File "/opt/conda/lib/python3.11/site-packages/datalad_next/patches/interface_utils.py", line 218, in _execute_command_
    for r in _process_results(
  File "/opt/conda/lib/python3.11/site-packages/datalad/interface/utils.py", line 319, in _process_results
    for res in results:
  File "/opt/conda/lib/python3.11/site-packages/datalad/distribution/get.py", line 902, in __call__
    for sdsres in Subdatasets.__call__(
  File "/opt/conda/lib/python3.11/site-packages/datalad_next/patches/interface_utils.py", line 218, in _execute_command_
    for r in _process_results(
  File "/opt/conda/lib/python3.11/site-packages/datalad/interface/utils.py", line 319, in _process_results
    for res in results:
  File "/opt/conda/lib/python3.11/site-packages/datalad/local/subdatasets.py", line 268, in __call__
    contains = resolve_path(ensure_list(contains), dataset, ds)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/conda/lib/python3.11/site-packages/datalad/distribution/dataset.py", line 661, in resolve_path
    p = ut.Path(p)
        ^^^^^^^^^^
  File "/opt/conda/lib/python3.11/pathlib.py", line 871, in __new__
    self = cls._from_parts(args)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/opt/conda/lib/python3.11/pathlib.py", line 509, in _from_parts
    drv, root, parts = self._parse_args(args)
                       ^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/conda/lib/python3.11/pathlib.py", line 493, in _parse_args
    a = os.fspath(a)
        ^^^^^^^^^^^^
TypeError: expected str, bytes or os.PathLike object, not list

Additional information / screenshots

Here are the contents of the config file. I have no idea why the version number for MRIQC is reported the way it is below. For reference, I created the container using the following command: singularity build mriqc-24.0.0.simg docker://nipreps/mriqc:24.0.0

[environment]
cpu_count = 48
exec_env = "singularity"
free_mem = 28.6
freesurfer_home = "/opt/freesurfer"
overcommit_policy = "heuristic"
overcommit_limit = "50%"
nipype_version = "1.8.6"
synthstrip_path = "PosixPath('/opt/freesurfer/models/synthstrip.1.pt')"
templateflow_version = "24.2.0"
total_memory = 125.58417892456055
version = "24.1.0.dev0+g3fe90466.d20240417"

[execution]
ants_float = false
bids_dir = "/base"
bids_dir_datalad = true
bids_database_dir = "/scratch/.bids_db-20240417-122835_78905bae-33d6-43cf-bacb-acb86d13225d"
bids_database_wipe = false
cwd = "/home/tug87422@tu.temple.edu"
datalad_get = true
debug = false
dry_run = false
dsname = "<unset>"
float32 = true
layout = "BIDS Layout: /base"
log_dir = "/base/derivatives/mriqc/logs"
log_level = 25
modalities = [ "T1w", "T2w", "bold", "dwi",]
no_sub = false
notrack = false
output_dir = "/base/derivatives/mriqc"
participant_label = [ "10043",]
pdb = false
reports_only = false
resource_monitor = false
run_uuid = "20240417-122835_78905bae-33d6-43cf-bacb-acb86d13225d"
templateflow_home = "/templateflow"
upload_strict = false
verbose_reports = false
webapi_url = "https://mriqc.nimh.nih.gov:443/api/v1"
work_dir = "/scratch"
write_graph = false

[workflow]
analysis_level = [ "participant",]
biggest_file_gb = 2.648588978346189e-10
deoblique = false
despike = false
fd_thres = 0.2
fd_radius = 50
fft_spikes_detector = false
min_len_dwi = 7
min_len_bold = 5
species = "human"
template_id = "MNI152NLin2009cAsym"

[nipype]
crashfile_format = "txt"
get_linked_libs = false
local_hash_check = true
nprocs = 48
omp_nthreads = 1
plugin = "MultiProc"
remove_node_directories = false
resource_monitor = false
stop_on_first_crash = true

[settings]
file_path = "/scratch/config-20240417-122835_78905bae-33d6-43cf-bacb-acb86d13225d.toml"
start_time = 1713371315.80771

[execution.bids_filters]

[workflow.inputs]
t1w = [ "/base/sub-10043/anat/sub-10043_T1w.nii.gz",]
bold = [ "/base/sub-10043/func/sub-10043_task-sharedreward_acq-mb1me1_bold.nii.gz", [ "/base/sub-10043/func/sub-10043_task-sharedreward_acq-mb1me4_echo-1_part-mag_bold.nii.gz", "/base/sub-10043/func/sub-10043_task-sharedreward_acq-mb1me4_echo-2_part-mag_bold.nii.gz", "/base/sub-10043/func/sub-10043_task-sharedreward_acq-mb1me4_echo-3_part-mag_bold.nii.gz", "/base/sub-10043/func/sub-10043_task-sharedreward_acq-mb1me4_echo-4_part-mag_bold.nii.gz",], "/base/sub-10043/func/sub-10043_task-sharedreward_acq-mb3me1_bold.nii.gz", [ "/base/sub-10043/func/sub-10043_task-sharedreward_acq-mb3me4_echo-1_part-mag_bold.nii.gz", "/base/sub-10043/func/sub-10043_task-sharedreward_acq-mb3me4_echo-2_part-mag_bold.nii.gz", "/base/sub-10043/func/sub-10043_task-sharedreward_acq-mb3me4_echo-3_part-mag_bold.nii.gz", "/base/sub-10043/func/sub-10043_task-sharedreward_acq-mb3me4_echo-4_part-mag_bold.nii.gz",], "/base/sub-10043/func/sub-10043_task-sharedreward_acq-mb6me1_bold.nii.gz", [ "/base/sub-10043/func/sub-10043_task-sharedreward_acq-mb6me4_echo-1_part-mag_bold.nii.gz", "/base/sub-10043/func/sub-10043_task-sharedreward_acq-mb6me4_echo-2_part-mag_bold.nii.gz", "/base/sub-10043/func/sub-10043_task-sharedreward_acq-mb6me4_echo-3_part-mag_bold.nii.gz", "/base/sub-10043/func/sub-10043_task-sharedreward_acq-mb6me4_echo-4_part-mag_bold.nii.gz",],]

[nipype.plugin_args]
maxtasksperchild = 1
raise_insufficient = false
oesteban commented 5 months ago

Good catch. For the moment, you can make sure you have run datalad get on all images and add --no-datalad-get to your mriqc call. I'll address this ASAP.

The problem: this dataset has multi-echo images, so the nested list of files should be flattened before being passed to datalad get.
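To illustrate the shape of the problem: in the [workflow.inputs] dump above, single-echo runs appear as plain path strings while multi-echo runs appear as sub-lists of echo files, and datalad's get() chokes on the sub-lists. A minimal sketch of the kind of flattening needed (the helper name is hypothetical, not MRIQC's actual code):

```python
def flatten_paths(inputs):
    """Yield plain path strings from an arbitrarily nested list of paths."""
    for item in inputs:
        if isinstance(item, (list, tuple)):
            yield from flatten_paths(item)  # recurse into multi-echo sub-lists
        else:
            yield item


# Shape mirrors the [workflow.inputs] bold entry: strings mixed with sub-lists.
bold = [
    "sub-10043_task-sharedreward_acq-mb1me1_bold.nii.gz",
    [
        "sub-10043_task-sharedreward_acq-mb1me4_echo-1_part-mag_bold.nii.gz",
        "sub-10043_task-sharedreward_acq-mb1me4_echo-2_part-mag_bold.nii.gz",
    ],
]
print(list(flatten_paths(bold)))
```

With the list flattened, every element is a str, so the os.fspath() call deep inside datalad no longer raises the TypeError.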

The "other" problem: I think #1120 would also fail for this.

DVSneuro commented 5 months ago

Thanks, Oscar. I tried that, but I'm still getting a similar error when using the --no-datalad-get option (see below). I also get a similar issue with data that hasn't been pulled from DataLad.

(base) tug87422@cla18994:/ZPOOL/data/projects/ds005085$ bash code/mriqc.sh 10043
------------------------------------------------------------------
  Running MRIQC version 24.1.0.dev0+g3fe90466.d20240417
  ----------------------------------------------------------------
  * BIDS dataset path: /base.
  * Output folder: /base/derivatives/mriqc.
  * Analysis levels: ['participant'].
------------------------------------------------------------------

2024-04-18 07:25:36 | IMPORTANT | mriqc            | Building MRIQC's workflows...
Process Process-2:
Traceback (most recent call last):
  File "/opt/conda/lib/python3.11/multiprocessing/process.py", line 314, in _bootstrap
    self.run()
  File "/opt/conda/lib/python3.11/multiprocessing/process.py", line 108, in run
    self._target(*self._args, **self._kwargs)
  File "/opt/conda/lib/python3.11/site-packages/mriqc/cli/workflow.py", line 56, in build_workflow
    retval['workflow'] = init_mriqc_wf()
                         ^^^^^^^^^^^^^^^
  File "/opt/conda/lib/python3.11/site-packages/mriqc/workflows/core.py", line 48, in init_mriqc_wf
    workflow.add_nodes([fmri_qc_workflow()])
                        ^^^^^^^^^^^^^^^^^^
  File "/opt/conda/lib/python3.11/site-packages/mriqc/workflows/functional/base.py", line 90, in fmri_qc_workflow
    bold_len = nb.load(bold_path).shape[3]
               ^^^^^^^^^^^^^^^^^^
  File "/opt/conda/lib/python3.11/site-packages/nibabel/loadsave.py", line 96, in load
    filename = _stringify_path(filename)
               ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/conda/lib/python3.11/site-packages/nibabel/filename_parser.py", line 41, in _stringify_path
    return pathlib.Path(filepath_or_buffer).expanduser().as_posix()
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/conda/lib/python3.11/pathlib.py", line 871, in __new__
    self = cls._from_parts(args)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/opt/conda/lib/python3.11/pathlib.py", line 509, in _from_parts
    drv, root, parts = self._parse_args(args)
                       ^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/conda/lib/python3.11/pathlib.py", line 493, in _parse_args
    a = os.fspath(a)
        ^^^^^^^^^^^^
TypeError: expected str, bytes or os.PathLike object, not list
oesteban commented 5 months ago

Oh yes, you need to run datalad get on your own -- the actual data are not there!

The --no-datalad-get flag disables fetching the data with DataLad.

DVSneuro commented 5 months ago

Thanks! The weird thing here is that all the data are already there on the local computer, and I get the same TypeError when running the same dataset that was uploaded to OpenNeuro in the first place. So, I'm confused!

(base) tug87422@cla18994:/ZPOOL/data/projects/ds005085$ datalad get . *
action summary:
  get (notneeded: 24)

I suspect it will replicate, but let me know if you need the config file again. It looks the same to me except it says datalad_get = false.

Thanks for any help, and sorry if our dataset is a weird edge case!

Best wishes, David

oesteban commented 5 months ago

@DVSneuro -- sorry I didn't read your message (https://github.com/nipreps/mriqc/issues/1282#issuecomment-2063647205) with enough depth in my first pass.

Indeed, with --no-datalad-get you got around the first issue:

This dataset has multi-echo images and the list should be flattened before passing to datalad for get.

But you hit the other issue:

The "other" problem: I think https://github.com/nipreps/mriqc/pull/1120 would also fail for this.

A fix should land soon.
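For context, the second traceback fails at nb.load(bold_path) because a multi-echo run is passed as a list of echo files rather than a single path. One way to normalize the input before probing the run length is sketched below; the helper is hypothetical, not the actual fix, and it relies on the fact that all echoes of a run share the same number of volumes:

```python
def first_path(bold_path):
    """Return a single filesystem path from a str or a (nested) list of echo paths."""
    if isinstance(bold_path, (list, tuple)):
        return first_path(bold_path[0])  # recurse in case of deeper nesting
    return bold_path


# The length probe in fmri_qc_workflow would then read, e.g.:
#     bold_len = nb.load(first_path(bold_path)).shape[3]
print(first_path("sub-10043_task-sharedreward_acq-mb1me1_bold.nii.gz"))
print(first_path([
    "sub-10043_task-sharedreward_acq-mb1me4_echo-1_part-mag_bold.nii.gz",
    "sub-10043_task-sharedreward_acq-mb1me4_echo-2_part-mag_bold.nii.gz",
]))
```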

DVSneuro commented 5 months ago

Thanks @oesteban! And no worries! I appreciate all the support with these tools, and we're excited to use the latest version of MRIQC on our data. Thanks again for providing these tools for the community!