bowang-lab / MedSAM

Segment Anything in Medical Images
https://www.nature.com/articles/s41467-024-44824-z
Apache License 2.0

nnUNet #330

Open · Pankajg959 opened this issue 1 month ago

Pankajg959 commented 1 month ago

Hey @JunMa11, I am an academic radiologist. I enjoy experimenting with MedSAM. Congratulations on this significant contribution! I am comparing MedSAM's segmentation results with those of nnUNet.

  1. There is no training script in the comparisons/nnU-Net folder. As per the documentation (https://github.com/bowang-lab/MedSAM/blob/main/comparisons/nnU-Net/README.md), "This folder contains the scripts for training and inference of the nnUNet model on medical image data in MedSAM's preprocessed npz format." I expected a training script that would allow me to train using MedSAM's preprocessed images.

  2. When I used a model trained with the standard nnUNetv2 pipeline for inference, there was an error:

     python infer_nnunet_3D.py -checkpoint /home/pankaj/scratch/nnUNet/nnUNet_results/Dataset200 -data_root /home/pankaj/scratch/MedSAM/data/npz/MedSAM_test/CT_Abd -pred_save_dir /home/pankaj/scratch/MedSAM/nnunet_pred --save_overlay -png_save_dir /home/pankaj/scratch/MedSAM/nnunet_pred/png -num_workers 2

     Traceback (most recent call last):
       File "/scratch/pankaj/MedSAM_main/comparisons/nnU-Net/infer_nnunet_3D.py", line 119, in <module>
         predictor = nnUNetPredictor(
     TypeError: nnUNetPredictor.__init__() got an unexpected keyword argument 'perform_everything_on_gpu'

When I changed the argument to "perform_everything_on_device", there was another error:

     Traceback (most recent call last):
       File "/home/pankaj/scratch/ananconda3/envs/medsam/lib/python3.10/multiprocessing/pool.py", line 125, in worker
         result = (True, func(*args, **kwds))
       File "/scratch/pankaj/MedSAM_main/comparisons/nnU-Net/infer_nnunet_3D.py", line 206, in nnunet_infer_npz
         seg_2D = predictor.predict_single_npy_array(
       File "/home/pankaj/scratch/ananconda3/envs/medsam/lib/python3.10/site-packages/nnunetv2/inference/predict_from_raw_data.py", line 444, in predict_single_npy_array
         dct = next(ppa)
       File "/home/pankaj/scratch/ananconda3/envs/medsam/lib/python3.10/site-packages/batchgenerators/dataloading/data_loader.py", line 126, in __next__
         return self.generate_train_batch()
       File "/home/pankaj/scratch/ananconda3/envs/medsam/lib/python3.10/site-packages/nnunetv2/inference/data_iterators.py", line 198, in generate_train_batch
         data, seg = self.preprocessor.run_case_npy(image, seg_prev_stage, props,
       File "/home/pankaj/scratch/ananconda3/envs/medsam/lib/python3.10/site-packages/nnunetv2/preprocessing/preprocessors/default_preprocessor.py", line 44, in run_case_npy
         data = data.astype(np.float32)  # this creates a copy
     AttributeError: 'Tensor' object has no attribute 'astype'. Did you mean: 'dtype'?

     The above exception was the direct cause of the following exception:

     Traceback (most recent call last):
       File "/scratch/pankaj/MedSAM_main/comparisons/nnU-Net/infer_nnunet_3D.py", line 259, in <module>
         for i, _ in tqdm(enumerate(pool.imap_unordered(nnunet_infer_npz, gt_path_files))):
       File "/home/pankaj/scratch/ananconda3/envs/medsam/lib/python3.10/site-packages/tqdm/std.py", line 1181, in __iter__
         for obj in iterable:
       File "/home/pankaj/scratch/ananconda3/envs/medsam/lib/python3.10/multiprocessing/pool.py", line 873, in next
         raise value
     AttributeError: 'Tensor' object has no attribute 'astype'

(A minimal sketch of the two API points involved is included after this list.)

  3. I also noticed that the model link in "The inference scripts assume that the data is in the npz format generated by MedSAM preprocess pipeline. To run inference, one can download the model here and use the provided inference scripts" points to the DeepLab models rather than the nnU-Net model.
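For reference, here is a minimal sketch of the two API points that the tracebacks seem to involve. The constructor keywords follow the current nnunetv2 nnUNetPredictor signature; the paths, the 'imgs'/'spacing' keys, and the Tensor-to-array conversion are my own assumptions rather than code taken from infer_nnunet_3D.py.

```python
import numpy as np
import torch
from nnunetv2.inference.predict_from_raw_data import nnUNetPredictor

# 1) Recent nnunetv2 releases renamed the constructor keyword:
#    perform_everything_on_gpu -> perform_everything_on_device
predictor = nnUNetPredictor(
    tile_step_size=0.5,
    use_gaussian=True,
    use_mirroring=True,
    perform_everything_on_device=True,  # older releases used perform_everything_on_gpu
    device=torch.device("cuda"),
    verbose=False,
    allow_tqdm=False,
)
# Placeholder model folder; a real run needs the trained dataset folder and fold(s).
predictor.initialize_from_trained_model_folder(
    "/path/to/nnUNet_results/DatasetXXX/nnUNetTrainer__nnUNetPlans__3d_fullres",
    use_folds=(0,),
    checkpoint_name="checkpoint_final.pth",
)

# 2) run_case_npy calls data.astype(np.float32), so predict_single_npy_array must
#    receive a numpy array, not a torch.Tensor. Converting first avoids the
#    AttributeError; this is my guess at a workaround, not code from the repo.
npz = np.load("/path/to/case.npz")     # placeholder MedSAM npz case
img = npz["imgs"]                      # assumed key name in the preprocessed npz
if torch.is_tensor(img):               # guard in case the pipeline hands over a Tensor
    img = img.cpu().numpy()
img = np.ascontiguousarray(img)[None].astype(np.float32)  # add channel axis: (1, z, y, x)
props = {"spacing": [1.5, 1.5, 1.5]}   # placeholder spacing metadata
seg = predictor.predict_single_npy_array(img, props, None, None, False)
```

Alternatively, pinning nnunetv2 to an older release that still accepts perform_everything_on_gpu should avoid the first error, though I have not verified which version that is.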

Please help.

Dr. Pankaj Gupta, M.D.

Pankajg959 commented 1 month ago

Dear @JunMa11, I am waiting for your response. Thanks.

JunMa11 commented 1 week ago

Hi @Pankajg959,

Sorry for my late reply. It has been a busy month.

  1. No, there is no training script in that folder. To train nnU-Net, you should convert the npz files to NIfTI or PNG format and use the standard nnU-Net training pipeline.

  2&3. It could be that the latest nnU-Net has updated some functions, while we used the 2023 version. I would recommend converting the data to NIfTI/PNG format and using the official inference interface; a rough conversion sketch is below.
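For example, here is a rough sketch of such a conversion for 3D cases; the 'imgs'/'gts'/'spacing' key names, the paths, and the spacing axis order are illustrative, so please adapt them to your npz files.

```python
import os
from glob import glob

import numpy as np
import SimpleITK as sitk

npz_dir = "/path/to/MedSAM/npz/CT_Abd"                      # placeholder paths
img_out = "/path/to/nnUNet_raw/Dataset200_CTAbd/imagesTr"
lbl_out = "/path/to/nnUNet_raw/Dataset200_CTAbd/labelsTr"
os.makedirs(img_out, exist_ok=True)
os.makedirs(lbl_out, exist_ok=True)

for npz_path in sorted(glob(os.path.join(npz_dir, "*.npz"))):
    case = os.path.basename(npz_path)[:-4]
    data = np.load(npz_path)
    img = data["imgs"].astype(np.float32)    # assumed key: (z, y, x) image volume
    gt = data["gts"].astype(np.uint8)        # assumed key: (z, y, x) label volume
    spacing = data["spacing"] if "spacing" in data else (1.0, 1.0, 1.0)

    img_itk = sitk.GetImageFromArray(img)
    gt_itk = sitk.GetImageFromArray(gt)
    # SimpleITK expects (x, y, z) spacing; flip if the npz stores it as (z, y, x).
    img_itk.SetSpacing(tuple(float(s) for s in spacing)[::-1])
    gt_itk.SetSpacing(img_itk.GetSpacing())

    # nnU-Net naming convention: CASE_0000.nii.gz for channel 0, CASE.nii.gz for the label.
    sitk.WriteImage(img_itk, os.path.join(img_out, f"{case}_0000.nii.gz"))
    sitk.WriteImage(gt_itk, os.path.join(lbl_out, f"{case}.nii.gz"))
```

After adding a standard dataset.json, the official pipeline can be used as usual, e.g. nnUNetv2_plan_and_preprocess -d 200 --verify_dataset_integrity, nnUNetv2_train 200 3d_fullres 0, and nnUNetv2_predict -i <imagesTs> -o <output> -d 200 -c 3d_fullres.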

The nnU-Net team also developed a similar solution in the CVPR challenge. Here is their report for your review.

https://openreview.net/forum?id=N1uNPfYHFw

Sorry again for the late reply. Please feel free to raise any questions.