MIC-DKFZ / nnUNet

Apache License 2.0

"Could not find a dataset with ID..." #2527

Closed lara-bonney closed 1 month ago

lara-bonney commented 1 month ago

Hi,

I think this is similar to a previous issue (#2405) which was closed without a resolution, and I would appreciate any advice you have on getting this to work. Essentially, I'm confident I've set the environment variables correctly and that the data is in the correct format; however, I repeatedly get the following error:

"RuntimeError: Could not find a dataset with the ID 1. Make sure the requested dataset ID exists and that nnU-Net knows where raw and preprocessed data are located (see Documentation - Installation). Here are your currently defined folders:"

I've tried the following Python code to debug this, and I'm increasingly confused as to where the error is coming from:

```python
import os

from batchgenerators.utilities.file_and_folder_operations import subdirs
from nnunetv2.utilities.dataset_name_id_conversion import find_candidate_datasets

# Check that the environment variables are set and point to real directories
print(f"nnUNet_raw: {os.getenv('nnUNet_raw')}")
print(f"nnUNet_preprocessed: {os.getenv('nnUNet_preprocessed')}")
print(f"nnUNet_results: {os.getenv('nnUNet_results')}")

print(os.path.isdir(os.getenv('nnUNet_raw')))
print(os.path.isdir(os.getenv('nnUNet_preprocessed')))
print(os.path.isdir(os.getenv('nnUNet_results')))

# List dataset folders in the raw data directory
nnUNet_raw = os.getenv('nnUNet_raw')
raw_subdirs = subdirs(nnUNet_raw, prefix="Dataset_001", join=False)
print(f"Raw subdirectories found: {raw_subdirs}")

# Try to find datasets by ID
candidates = find_candidate_datasets(1)
print(f"Candidates found: {candidates}")
```

Output:

```
nnUNet_raw: /nnUNet_data/nnUNet_raw
nnUNet_preprocessed: /nnUNet_data/nnUNet_preprocessed
nnUNet_results: /nnUNet_data/nnUNet_results
True
True
True
Raw subdirectories found: ['Dataset_001']
Candidates found: []
```

Any advice would be greatly appreciated.

Many thanks

gaojh135 commented 1 month ago

The dataset folder should be named `Dataset001_xxx` (no underscore between `Dataset` and the three-digit ID).
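For reference, nnU-Net v2 locates datasets by matching folder names against the `DatasetXXX_Name` convention, where `XXX` is a zero-padded three-digit ID immediately following `Dataset`. Below is a minimal illustrative sketch of that naming check (this is not nnU-Net's own validation code, just a regex demonstrating the convention):

```python
import re

# Convention: "Dataset" + three-digit zero-padded ID + "_" + dataset name,
# e.g. "Dataset001_BrainTumour". Note: no underscore between "Dataset" and the ID.
DATASET_PATTERN = re.compile(r"^Dataset(\d{3})_\w+$")

def is_valid_dataset_folder(name: str) -> bool:
    """Return True if the folder name follows the DatasetXXX_Name convention."""
    return DATASET_PATTERN.match(name) is not None

print(is_valid_dataset_folder("Dataset001_BrainTumour"))  # True
print(is_valid_dataset_folder("Dataset_001"))             # False: underscore before the ID
```

So in this case, renaming `Dataset_001` to something like `Dataset001_MyData` lets nnU-Net resolve dataset ID 1.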

lara-bonney commented 1 month ago

Thank you so much for the quick reply, sorry for the silly mistake!! This works now.