LuxImagingAI / DBSegment

This is a deep learning-based method to segment deep brain structures and a brain mask from T1-weighted MRI.
GNU General Public License v3.0

failed run in virtual environment #3

Closed: dcoynel closed this issue 1 year ago

dcoynel commented 1 year ago

Hi, thanks for this very nice tool!

I wanted to test it, and installed it through pip as described in your README.

When running the standard command DBSegment -i input -o output, I receive the following segmentation fault error.

Could you please let me know how I can debug this issue? I am running DBSegment on an Intel MacBook with macOS 13.1, so there is no GPU, but I assumed CPU-only inference would work?

Best, David

Pre-processing: 1500_t1.nii
Model exists. Segmenting.

Please cite the following papers when using DBSegment:

Mehri Baniasadi, Mikkel V. Petersen, Jorge Goncalves, Andreas Horn, Vanja Vlasov, Frank Hertel, Andreas Husch, DBSegment: Fast and robust segmentation of deep brain structures - Evaluation of transportability across acquisition domains.

Isensee, F., Jaeger, P.F., Kohl, S.A.A. et al. "nnU-Net: a self-configuring method for deep learning-based biomedical image segmentation." Nat Methods (2020). https://doi.org/10.1038/s41592-020-01008-z


Traceback (most recent call last):
  File "/Users/dcoynel/Desktop/dbsegment/env/bin/DBSegment", line 8, in <module>
    sys.exit(main())
  File "/Users/dcoynel/Desktop/dbsegment/env/lib/python3.10/site-packages/DBSegment/DBSegment.py", line 597, in main
    main_infer()
  File "/Users/dcoynel/Desktop/dbsegment/env/lib/python3.10/site-packages/DBSegment/DBSegment.py", line 501, in main_infer
    inference(parser)
  File "/Users/dcoynel/Desktop/dbsegment/env/lib/python3.10/site-packages/DBSegment/DBSegment.py", line 357, in inference
    predict_from_folder(model_folder_name, input_folder, output_folder, folds, save_npz, num_threads_preprocessing,
  File "/Users/dcoynel/Desktop/dbsegment/env/lib/python3.10/site-packages/DBSegment/nnunet/inference/predict.py", line 666, in predict_from_folder
    return predict_cases(model, list_of_lists[part_id::num_parts], output_files[part_id::num_parts], folds,
  File "/Users/dcoynel/Desktop/dbsegment/env/lib/python3.10/site-packages/DBSegment/nnunet/inference/predict.py", line 208, in predict_cases
    for preprocessed in preprocessing:
  File "/Users/dcoynel/Desktop/dbsegment/env/lib/python3.10/site-packages/DBSegment/nnunet/inference/predict.py", line 109, in preprocess_multithreaded
    pr.start()
  File "/Users/dcoynel/fsl/lib/python3.10/multiprocessing/process.py", line 121, in start
    self._popen = self._Popen(self)
  File "/Users/dcoynel/fsl/lib/python3.10/multiprocessing/context.py", line 224, in _Popen
    return _default_context.get_context().Process._Popen(process_obj)
  File "/Users/dcoynel/fsl/lib/python3.10/multiprocessing/context.py", line 288, in _Popen
    return Popen(process_obj)
  File "/Users/dcoynel/fsl/lib/python3.10/multiprocessing/popen_spawn_posix.py", line 32, in __init__
    super().__init__(process_obj)
  File "/Users/dcoynel/fsl/lib/python3.10/multiprocessing/popen_fork.py", line 19, in __init__
    self._launch(process_obj)
  File "/Users/dcoynel/fsl/lib/python3.10/multiprocessing/popen_spawn_posix.py", line 47, in _launch
    reduction.dump(process_obj, fp)
  File "/Users/dcoynel/fsl/lib/python3.10/multiprocessing/reduction.py", line 60, in dump
    ForkingPickler(file, protocol).dump(obj)
_pickle.PicklingError: Can't pickle <function at 0x125eec820>: attribute lookup on DBSegment.nnunet.utilities.nd_softmax failed
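
For context, the failure happens while the 'spawn' start method (macOS's default since Python 3.8) pickles the preprocessing worker: spawn serializes the worker's target by its module path, so any callable the child interpreter cannot re-import raises exactly this kind of _pickle.PicklingError. A minimal sketch of the same class of failure (plain Python, not DBSegment code):

```python
# Minimal sketch of the error class above (not DBSegment code), assuming the
# 'spawn' start method that macOS uses by default: spawn pickles the worker's
# target by module path, so a callable the child cannot look up by name fails.
import multiprocessing as mp

fn = lambda x: x  # bound to the name 'fn', but pickled under the name '<lambda>'

if __name__ == "__main__":
    mp.set_start_method("spawn", force=True)
    p = mp.Process(target=fn, args=(1,))
    p.start()  # _pickle.PicklingError: Can't pickle <function <lambda> ...>:
    p.join()   # attribute lookup <lambda> on __main__ failed
```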

dcoynel commented 1 year ago

This seems to have been related to an interaction between the virtual environment and the FSL Python libraries. I set up the virtual environment on an HPC cluster instead, and it worked fine.
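
The traceback above mixes two Python installations: DBSegment resolves from the venv's site-packages while the multiprocessing stdlib resolves from /Users/dcoynel/fsl/..., which suggests the venv was created from (or shadowed by) FSL's bundled interpreter. A small diagnostic sketch (assumes the venv is activated and DBSegment is installed) to check where each module resolves from:

```python
# Diagnostic sketch (not part of DBSegment): print where the interpreter and
# key modules resolve from. A mismatch between the venv path and another
# install (e.g. FSL's bundled Python) points to the environment mix-up above.
import sys
import multiprocessing

print("interpreter:     ", sys.executable)
print("base interpreter:", sys.base_prefix)
print("multiprocessing: ", multiprocessing.__file__)

try:
    import DBSegment
    print("DBSegment:       ", DBSegment.__file__)
except ImportError:
    print("DBSegment is not importable from this interpreter")
```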