CCI-Bonn / HD-GLIO

Automated deep-learning based brain tumor segmentation on MRI
Apache License 2.0

FileNotFound hd_glio_params #1

Closed oneeq closed 3 years ago

oneeq commented 4 years ago

Dear all, I got hd-glio running two weeks ago, and due to the nice results I now wanted to run it for several cases. However, it now fails with the information/traceback below. The "checkpoint file" is not part of the downloaded zip. I already reinstalled hd-glio and nnUNet, which did not help. I am running this on Linux Mint 19. Do you have any suggestions?

Thank you very much, Patrick

Traceback (most recent call last):
  File "/usr/local/bin/hd_glio_predict", line 11, in <module>
    load_entry_point('hd-glio==1.4', 'console_scripts', 'hd_glio_predict')()
  File "/home/nadine/.local/lib/python3.6/site-packages/hd_glio/hd_glio_predict.py", line 60, in main
    True)
  File "/usr/local/nnUNet/nnunet/inference/predict.py", line 179, in predict_cases
    trainer, params = load_model_and_checkpoint_files(model, folds, fp16=fp16, checkpoint_name=checkpoint_name)
  File "/usr/local/nnUNet/nnunet/training/model_restore.py", line 140, in load_model_and_checkpoint_files
    trainer = restore_model(join(folds[0], "%s.model.pkl" % checkpoint_name), fp16=fp16)
  File "/usr/local/nnUNet/nnunet/training/model_restore.py", line 56, in restore_model
    info = load_pickle(pkl_file)
  File "/home/nadine/.local/lib/python3.6/site-packages/batchgenerators/utilities/file_and_folder_operations.py", line 49, in load_pickle
    with open(file, mode) as f:
FileNotFoundError: [Errno 2] No such file or directory: '/home/nadine/hd_glio_params/fold_0/model_final_checkpoint.model.pkl'
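The traceback boils down to the checkpoint file being absent from the parameter folder. A quick way to confirm this before re-running is a minimal existence check; `checkpoint_exists` below is a hypothetical helper (not part of hd-glio or nnUNet), with the path and file name taken from the traceback above:

```python
import os

def checkpoint_exists(params_dir, fold="fold_0",
                      name="model_final_checkpoint.model.pkl"):
    """Check whether the checkpoint pickle nnU-Net tries to load is present."""
    return os.path.isfile(os.path.join(params_dir, fold, name))

# Path reported in the traceback:
print(checkpoint_exists(os.path.expanduser("~/hd_glio_params")))
```

If this prints False, the download/extraction of the parameter files is the problem rather than the hd-glio installation itself.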

FabianIsensee commented 4 years ago

Hi Patrick, can you try this:

pip install nnunet==0.6

And let me know if that fixes it? Best, Fabian

FabianIsensee commented 4 years ago

Never mind, I changed some things. Please run:

pip install --upgrade hd_glio

Best, Fabian

oneeq commented 4 years ago

Hi,

Thanks a lot. It's running now and the results look good. I had to copy the postprocessing folder into the nnunet folder under /home/USER/.local/ and so on, as the nnunet folder I had installed under /usr/local/ was not recognised by hd-glio. There is still a warning about the nnunet version.

However, I now get the following message concerning the postprocessing (last lines of the predict output):

debug: mirroring True mirror_axes (0, 1, 2)
step: 2
do mirror: True
configuring tiles
data shape: (1, 4, 137, 171, 149)
patch size: [128 128 128]
steps (x, y, and z): [64 73] [ 64 107] [64 85]
number of tiles: 8
prediction on GPU done
inference done. Now waiting for the segmentation export to finish...
force_separate_z: None interpolation order: 3
no resampling necessary
WARNING! Cannot run postprocessing because the postprocessing file is missing. Make sure to run consolidate_folds in the output folder of the model first! The folder you need to run this in is /home/nadine/hd_glio_params

I tried to run consolidate_folds in that folder, but it returned the following error:

Traceback (most recent call last):
  File "consoli.py", line 3, in <module>
    consolidate_folds('/home/nadine/hd_glio_params')
  File "/home/nadine/.local/lib/python3.6/site-packages/nnunet/postprocessing/consolidate_postprocessing.py", line 38, in consolidate_folds
    assert all([isdir(i) for i in folders_folds]), "some folds are missing"
AssertionError: some folds are missing
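The assertion shown above simply checks with isdir that every expected fold directory exists. To see which ones are absent, something like the following hypothetical diagnostic can help; it assumes the usual nnU-Net layout of five folds (fold_0 through fold_4), and `find_missing_folds` is not part of nnUNet:

```python
import os

def find_missing_folds(model_dir, n_folds=5):
    """List the fold_i subdirectories that consolidate_folds would not find."""
    expected = [f"fold_{i}" for i in range(n_folds)]
    return [f for f in expected
            if not os.path.isdir(os.path.join(model_dir, f))]
```

Running this on /home/nadine/hd_glio_params would show which fold folders the download or the manual copy step left out.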

Best regards, Patrick


FabianIsensee commented 4 years ago

Hi Patrick, you can ignore this warning. There is no postprocessing for hd-glio. I will need to address that once I can go back to my workstation :-) Glad to hear it worked now. Best, Fabian