SuperMedIntel / Medical-SAM-Adapter

Adapting Segment Anything Model for Medical Image Segmentation
GNU General Public License v3.0

Fix multiprocessing crash -> allow validation to be invoked standalone #115

Closed · dzenanz closed this 5 months ago

dzenanz commented 5 months ago

The error was:

```
Traceback (most recent call last):
  File "C:\Program Files\Python39\lib\multiprocessing\spawn.py", line 125, in _main
    prepare(preparation_data)
  File "C:\Program Files\Python39\lib\multiprocessing\spawn.py", line 236, in prepare
    _fixup_main_from_path(data['init_main_from_path'])
  File "C:\Program Files\Python39\lib\multiprocessing\spawn.py", line 287, in _fixup_main_from_path
    main_content = runpy.run_path(main_path,
  File "C:\Program Files\Python39\lib\runpy.py", line 288, in run_path
    return _run_module_code(code, init_globals, run_name,
  File "C:\Program Files\Python39\lib\runpy.py", line 97, in _run_module_code
    _run_code(code, mod_globals, init_globals,
  File "C:\Program Files\Python39\lib\runpy.py", line 87, in _run_code
    exec(code, run_globals)
  File "M:\Dev\CXR\Medical-SAM-Adapter\val.py", line 95, in <module>
    tol, (eiou, edice) = function.validation_sam(args, nice_test_loader, start_epoch, net)
  File "M:\Dev\CXR\Medical-SAM-Adapter\function.py", line 244, in validation_sam
    for ind, pack in enumerate(val_loader):
  File "M:\Dev\CXR\.venv\lib\site-packages\torch\utils\data\dataloader.py", line 439, in __iter__
    return self._get_iterator()
  File "M:\Dev\CXR\.venv\lib\site-packages\torch\utils\data\dataloader.py", line 387, in _get_iterator
    return _MultiProcessingDataLoaderIter(self)
  File "M:\Dev\CXR\.venv\lib\site-packages\torch\utils\data\dataloader.py", line 1040, in __init__
    w.start()
  File "C:\Program Files\Python39\lib\multiprocessing\process.py", line 121, in start
    self._popen = self._Popen(self)
  File "C:\Program Files\Python39\lib\multiprocessing\context.py", line 224, in _Popen
    return _default_context.get_context().Process._Popen(process_obj)
  File "C:\Program Files\Python39\lib\multiprocessing\context.py", line 327, in _Popen
    return Popen(process_obj)
  File "C:\Program Files\Python39\lib\multiprocessing\popen_spawn_win32.py", line 45, in __init__
    prep_data = spawn.get_preparation_data(process_obj._name)
  File "C:\Program Files\Python39\lib\multiprocessing\spawn.py", line 154, in get_preparation_data
    _check_not_importing_main()
  File "C:\Program Files\Python39\lib\multiprocessing\spawn.py", line 134, in _check_not_importing_main
    raise RuntimeError('''
RuntimeError:
        An attempt has been made to start a new process before the
        current process has finished its bootstrapping phase.

        This probably means that you are not using fork to start your
        child processes and you have forgotten to use the proper idiom
        in the main module:

            if __name__ == '__main__':
                freeze_support()
                ...

        The "freeze_support()" line can be omitted if the program
        is not going to be frozen to produce an executable.
```
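For context: on Windows, Python's multiprocessing uses the spawn start method, so every DataLoader worker process re-imports the main module. Because val.py calls `function.validation_sam` at module level (line 95 in the traceback), each worker's re-import tries to start workers of its own, which is exactly the bootstrapping error above. Below is a minimal, self-contained sketch of the failure mode and the standard `if __name__ == '__main__':` guard (illustrative only, with made-up names; not the actual diff of this PR):

```python
# Illustrative sketch, not this PR's diff. The names (run_validation,
# dataset) are hypothetical; only the guard idiom is the point.
import torch
from torch.utils.data import DataLoader, TensorDataset

# Module-level code runs again in every spawned worker process.
dataset = TensorDataset(torch.arange(8, dtype=torch.float32))

def run_validation():
    # num_workers > 0 makes the DataLoader start worker processes.
    # If this ran at import time on Windows, each worker's re-import
    # would try to start workers again, raising the RuntimeError above.
    loader = DataLoader(dataset, batch_size=2, num_workers=2)
    for _batch in loader:
        pass

if __name__ == '__main__':
    # The guard ensures workers are started only by the parent process.
    run_validation()
```

Wrapping the top-level code of val.py in such a guard is also what makes the script safe to invoke standalone, per the PR title; alternatively, `num_workers=0` avoids worker processes entirely, at the cost of single-process data loading.
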
WuJunde commented 5 months ago

@shinning0821 Would there be any problem?

shinning0821 commented 5 months ago

> @shinning0821 Would there be any problem?

I think the modifications to the code are reasonable, but I have not encountered this error before, and they do not seem to completely solve the multiprocessing issue.

WuJunde commented 5 months ago

> > @shinning0821 Would there be any problem?
>
> I think the modifications to the code are reasonable, but I have not encountered this error before, and they do not seem to completely solve the multiprocessing issue.

Thanks. Do you think we should merge it in?

shinning0821 commented 5 months ago

> > > @shinning0821 Would there be any problem?
> >
> > I think the modifications to the code are reasonable, but I have not encountered this error before, and they do not seem to completely solve the multiprocessing issue.
>
> Thanks. Do you think we should merge it in?

Yes, we can merge it.