psychelzh closed this issue 2 weeks ago
Since XCP-D ingests derivative datasets, which aren't supported by the validator, it doesn't run BIDS validation. Are you seeing something in your XCP-D log to indicate that it is running the bids validator?
Yes, it says Name is missing from 'dataset_description.json'.
Ah, I see. That kind of validation is minimal, and there are things that XCP-D still needs in order to work, including a valid dataset_description.json file in the preprocessed dataset.
Got it. And I found that dataset_description.json uses name instead of Name. Thanks! BTW, is it possible to make XCP-D work if dataset_description.json is invalid?
The info in that file is still necessary for XCP-D, so I don't think it's a good idea to not validate it.
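As a quick sanity check (this is not part of XCP-D; the required capitalized Name field comes from the BIDS specification, and the lowercase name fallback matches the problem described above), one could verify and patch the file with a short script:

```python
import json
from pathlib import Path


def ensure_name_field(deriv_dir, fallback="fMRIPrep derivatives"):
    """Check dataset_description.json for the required 'Name' key.

    If a lowercase 'name' key is present instead (as in the dataset
    discussed above), promote it to 'Name'; otherwise write a
    placeholder so downstream tools can at least load the file.
    """
    path = Path(deriv_dir) / "dataset_description.json"
    desc = json.loads(path.read_text())
    if "Name" not in desc:
        # BIDS requires the capitalized 'Name'; salvage a lowercase variant
        desc["Name"] = desc.pop("name", fallback)
        path.write_text(json.dumps(desc, indent=2))
    return desc["Name"]
```

This only papers over the one error reported here; other missing required fields (e.g. BIDSVersion) would still need fixing by hand.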
Hi Taylor,
My issue is different but also related to BIDS validation. I use xcp_d-0.9.0 and I am getting the following error:
Traceback (most recent call last):
File "/usr/local/miniconda/lib/python3.10/multiprocessing/process.py", line 314, in _bootstrap
self.run()
File "/usr/local/miniconda/lib/python3.10/multiprocessing/process.py", line 108, in run
self._target(*self._args, **self._kwargs)
File "/usr/local/miniconda/lib/python3.10/site-packages/xcp_d/cli/workflow.py", line 99, in build_workflow
retval["workflow"] = init_xcpd_wf()
File "/usr/local/miniconda/lib/python3.10/site-packages/xcp_d/workflows/base.py", line 78, in init_xcpd_wf
single_subject_wf = init_single_subject_wf(subject_id)
File "/usr/local/miniconda/lib/python3.10/site-packages/xcp_d/workflows/base.py", line 124, in init_single_subject_wf
subj_data = collect_data(
File "/usr/local/miniconda/lib/python3.10/site-packages/xcp_d/utils/bids.py", line 268, in collect_data
raise FileNotFoundError("No T1w or T2w files found.")
FileNotFoundError: No T1w or T2w files found.
When the layout is initialized somewhere in xcp_d, apparently, it doesn't include any subject directories in the index because validate = True (the derivatives were downloaded via NDA from ABCC). When I set it to False, it recognizes the T1w scan:
>>> layout = BIDSLayout("/gpfs3/well/margulies/projects/data/ABCD/fmriresults01/derivatives/fmriprep", validate = False)
>>> layout.get(return_type="file", subject=participant_label, suffix = "T1w")
['/gpfs3/well/margulies/projects/data/ABCD/fmriresults01/derivatives/fmriprep/sub-NDARINVMENCHP62/ses-baselineYear1Arm1/anat/sub-NDARINVMENCHP62_ses-baselineYear1Arm1_space-MNI152NLin2009cAsym_res-2_desc-preproc_T1w.nii.gz']
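As a pybids-independent cross-check (a sketch, not how XCP-D itself searches; the directory layout and entity naming are assumed from the snippet above), a plain glob can confirm the anatomical files exist on disk:

```python
from pathlib import Path


def find_anat_files(fmriprep_dir, subject):
    """Glob for preprocessed T1w/T2w images without building a BIDSLayout.

    This mirrors the kind of files XCP-D's collect_data needs to find;
    it is only a debugging aid for comparing against the layout index.
    """
    root = Path(fmriprep_dir) / f"sub-{subject}"
    hits = []
    for suffix in ("T1w", "T2w"):
        hits.extend(sorted(root.rglob(f"*_{suffix}.nii.gz")))
    return hits
```

If this finds files that the validated BIDSLayout misses, the problem is with indexing/validation rather than the data itself.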
I was wondering if adding --skip-bids-validation would be warranted in this case? For now I don't know how to circumvent this. Here's my xcp_d command:
singularity exec -B $MRIS -B $HOME -B $SIMG $SIMG/xcp_d-0.9.0.simg /usr/local/miniconda/bin/xcp_d \
$MRIS \
$XCPD_OUT \
participant \
--mode abcd \
--input-type fmriprep \
--motion-filter-type none \
--participant_label $sub \
--work-dir $XCPD_work \
-p 27P \
-t rest \
--file-format nifti \
-f 0.3 \
--lower-bpf 0.009 \
--upper-bpf 0.08 \
--fs-license-file $FS_LICENSE \
--stop-on-first-crash \
--warp-surfaces-native2std n \
--atlases '4S456Parcels' \
-vvv
The BIDSLayout object that is used to find T1w and T2w scans has validation disabled (see the code snippet below), so XCP-D must not be finding the files for another reason. We can dig into it, but I'd recommend opening a separate issue or a NeuroStars topic for your problem. https://github.com/PennLINC/xcp_d/blob/ddd7a559730f0fc841a97b0889bf3d07a183ec5c/xcp_d/config.py#L473-L484
This might be a standard command-line option in other BIDS Apps, and it would be very useful when a dataset is slightly modified.