QTIM-Lab / DeepNeuro

A deep learning Python package for neuroimaging data. Made by:
https://qtim-lab.github.io
MIT License

PermissionError Errno 13 - docker - Brain metastases segmentation #55

Open · giemmecci opened this issue 5 years ago

giemmecci commented 5 years ago

Hi, I'm getting the following error while trying to run the metastases segmentation via Docker (nvidia-docker):

File loading completed.
('Starting New Case...',)
('Enhancing Mets Prediction',)
('======================',)
('Working on image.. ', '/INPUT_DATA/Output_Folder')
('Working on Preprocessor:', 'Conversion')
('Working on Preprocessor:', 'N4BiasCorrection')
('Working on Preprocessor:', 'Registration')
('Working on Preprocessor:', 'ZeroMeanNormalization')
('Predicting patch set', '1/3...')
('Predicting patch set', '2/3...')
('Predicting patch set', '3/3...')
('Working on Preprocessor:', 'SkullStrip_Model')
('Working on Preprocessor:', 'ZeroMeanNormalization')
('Predicting patch set', '1/8...')
('Predicting patch set', '2/8...')
('Predicting patch set', '3/8...')
('Predicting patch set', '4/8...')
('Predicting patch set', '5/8...')
('Predicting patch set', '6/8...')
('Predicting patch set', '7/8...')
('Predicting patch set', '8/8...')
Using TensorFlow backend.
Traceback (most recent call last):
  File "/usr/local/bin/segment_mets", line 11, in <module>
    load_entry_point('deepneuro', 'console_scripts', 'segment_mets')()
  File "/home/DeepNeuro/deepneuro/pipelines/Segment_Brain_Mets/cli.py", line 91, in main
    Segment_Mets_cli()
  File "/home/DeepNeuro/deepneuro/pipelines/shared.py", line 22, in __init__
    self.load()
  File "/home/DeepNeuro/deepneuro/pipelines/Segment_Brain_Mets/cli.py", line 20, in load
    super(Segment_Mets_cli, self).load()
  File "/home/DeepNeuro/deepneuro/pipelines/shared.py", line 48, in load
    getattr(self, args.command)()
  File "/home/DeepNeuro/deepneuro/pipelines/Segment_Brain_Mets/cli.py", line 87, in pipeline
    quiet=args.quiet)
  File "/home/DeepNeuro/deepneuro/pipelines/Segment_Brain_Mets/predict.py", line 108, in predict_brain_mets
    data_collection.clear_preprocessor_outputs()
  File "/home/DeepNeuro/deepneuro/data/data_collection.py", line 650, in clear_preprocessor_outputs
    preprocessor.clear_outputs(self)
  File "/home/DeepNeuro/deepneuro/preprocessing/preprocessor.py", line 179, in clear_outputs
    os.remove(output_filename)
PermissionError: [Errno 13] Permission denied: '/INPUT_DATA/pn-0372_hdglio/T1.nii'
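
For context on where this fails: the last frame is DeepNeuro's cleanup step (preprocessor.clear_outputs calling os.remove) on a file inside the bind-mounted /INPUT_DATA. A minimal sketch of why that raises Errno 13, assuming a POSIX filesystem and a non-root user (unlinking a file needs write and execute permission on the parent directory, not on the file itself):

```python
import os
import shutil
import stat
import tempfile

# Minimal reproduction (illustrative, not DeepNeuro code): deleting a file
# requires write+execute permission on its *parent directory*.
parent = tempfile.mkdtemp()
path = os.path.join(parent, "T1.nii")
open(path, "w").close()

os.chmod(parent, stat.S_IRUSR | stat.S_IXUSR)  # read+execute only, no write
try:
    os.remove(path)  # PermissionError: [Errno 13] Permission denied
except PermissionError as err:
    print(err)
finally:
    os.chmod(parent, stat.S_IRWXU)  # restore permissions so cleanup can proceed
    shutil.rmtree(parent)
```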

Here's the command I run:

nvidia-docker run --rm -v /home/user/Desktop/tricktest/met:/INPUT_DATA qtimlab/deepneuro_segment_mets segment_mets pipeline -T1 /INPUT_DATA/pn-0372_hdglio/T1.nii -T1POST /INPUT_DATA/pn-0372_hdglio/CT1.nii -FLAIR /INPUT_DATA/pn-0372_hdglio/FLAIR.nii -T2 /INPUT_DATA/pn-0372_hdglio/T2.nii -output_folder /INPUT_DATA/Output_Folder -gpu_num 5

I don't know if this may be relevant, but I do not have sudo privileges on my system.
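
One thing that can be checked even without sudo is whether those directories are writable by the UID the container runs as. A hypothetical pre-flight sketch (paths taken from the command above), run inside the container:

```python
import os

# Hypothetical check: os.remove will fail unless the directory holding the
# file is writable and searchable by the current user.
for d in ("/INPUT_DATA/pn-0372_hdglio", "/INPUT_DATA/Output_Folder"):
    ok = os.access(d, os.W_OK | os.X_OK)
    print(d, "is writable" if ok else "is NOT writable (os.remove will fail)")
```

As an aside, docker run also accepts --user to run the container process with a specific UID/GID, which can sidestep ownership mismatches on the mount; I haven't verified whether the deepneuro_segment_mets image works when run as a non-root user.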

Thanks

UPDATE: I was able to get one step further by running chmod 777 on the directories I'm using, but I'm still getting the same error. This time I'm also getting some output in the folders (see attached screenshot), but I'm not sure whether something is missing.

Thanks! (attached: Screenshot from 2019-10-03 12-39-01)

giemmecci commented 5 years ago

[Solved] One of the mounted directories wasn't getting the chmod 777.

In brief, applying chmod 777 to all the directories involved solves the issue.
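
For anyone landing here later, a sketch of that fix under the same assumptions, equivalent to chmod -R 777 on the host directory that gets mounted as /INPUT_DATA (note that 777 makes everything world-writable, so it's a blunt instrument best confined to scratch data):

```python
import os

# Recursively open up permissions so the container user can both write
# outputs and delete intermediate files. Path is from my command above.
root = "/home/user/Desktop/tricktest/met"
for dirpath, _dirnames, filenames in os.walk(root):
    os.chmod(dirpath, 0o777)
    for name in filenames:
        os.chmod(os.path.join(dirpath, name), 0o777)
```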

Thanks

giemmecci commented 5 years ago

I've just noticed that after running the program I no longer have the original files, only the outputs (see picture). Is this expected behavior, or is it caused by the chmod 777?

Thanks! (attached: Screenshot from 2019-10-03 14-37-27)
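
One observation that may explain this: the traceback earlier in the thread shows clear_preprocessor_outputs attempting os.remove on the original '/INPUT_DATA/pn-0372_hdglio/T1.nii', so once chmod 777 allows that delete to succeed, the input files themselves can be removed during cleanup. A defensive workaround, assuming the goal is just to keep the originals, is to run the pipeline against a copy (hypothetical paths):

```python
import shutil

# Hypothetical backup on the host, before launching the container: keep the
# originals untouched and mount only the copy as /INPUT_DATA.
src = "/home/user/Desktop/tricktest/met/pn-0372_hdglio"
dst = "/home/user/Desktop/tricktest/met_copy/pn-0372_hdglio"
shutil.copytree(src, dst)
```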