Project-MONAI / MONAILabel

MONAI Label is an intelligent open source image labeling and learning tool.
https://docs.monai.io/projects/label
Apache License 2.0

Wrong result of spleen segmentation #872

Closed Ichiruchan closed 2 years ago

Ichiruchan commented 2 years ago

When I use the OHIF Viewer with MONAI Label, the result of the automatic spleen segmentation is not correct, as shown in this image:

[screenshot: incorrect automatic spleen segmentation in OHIF]

And here is a comparison with the automatic segmentation:

[screenshot: comparison with the automatic segmentation]

It seems that the spleen model annotated the liver, which is strange. The spleen model was downloaded from the MONAI Label GitHub release, and the contents of configs/segmentation_spleen.py and infers/segmentation_spleen.py are up to date. The DeepEdit result, by contrast, is correct:

[screenshot: correct DeepEdit segmentation result]

Here are some logs, although I don't think they contain anything useful:

[2022-07-14 07:54:53,918] [96] (monailabel.interfaces.tasks.infer:364) - Infer model path: /code/apps/radiology/model/pretrained_segmentation_spleen.pt
[2022-07-14 07:54:54,609] [96] [MainThread] [INFO] (monailabel.interfaces.utils.transform:62) - POST - Run Transform(s)
[2022-07-14 07:54:54,609] [96] [MainThread] [INFO] (monailabel.interfaces.utils.transform:63) - POST - Input Keys: ['model', 'image', 'result_extension', 'result_dtype', 'result_compress', 'description', 'device', 'image_path', 'image_meta_dict', 'image_transforms', 'pred']
[2022-07-14 07:54:54,610] [96] [MainThread] [INFO] (monailabel.interfaces.utils.transform:99) - POST - Transform (EnsureTyped): Time: 0.0001; image: torch.Size([1, 360, 360, 416])(torch.float32); pred: torch.Size([2, 360, 360, 416])(torch.float32)
[2022-07-14 07:54:54,612] [96] [MainThread] [INFO] (monailabel.interfaces.utils.transform:99) - POST - Transform (Activationsd): Time: 0.0018; image: torch.Size([1, 360, 360, 416])(torch.float32); pred: torch.Size([2, 360, 360, 416])
[2022-07-14 07:54:54,614] [96] [MainThread] [INFO] (monailabel.interfaces.utils.transform:99) - POST - Transform (AsDiscreted): Time: 0.0021; image: torch.Size([1, 360, 360, 416])(torch.float32); pred: torch.Size([1, 360, 360, 416])(torch.float32)
[2022-07-14 07:54:54,805] [96] [MainThread] [INFO] (monailabel.interfaces.utils.transform:99) - POST - Transform (ToNumpyd): Time: 0.1910; image: torch.Size([1, 360, 360, 416])(torch.float32); pred: (1, 360, 360, 416)(float32)
[2022-07-14 07:54:54,918] [96] [MainThread] [INFO] (monailabel.interfaces.utils.transform:99) - POST - Transform (Restored): Time: 0.1127; image: torch.Size([1, 360, 360, 416])(torch.float32); pred: (512, 512, 84)(float32)
[2022-07-14 07:54:54,960] [96] [MainThread] [INFO] (monailabel.interfaces.utils.transform:99) - POST - Transform (BoundingBoxd): Time: 0.0414; image: torch.Size([1, 360, 360, 416])(torch.float32); pred: (512, 512, 84)(float32)
[2022-07-14 07:54:54,960] [96] [MainThread] [INFO] (monailabel.interfaces.tasks.infer:532) - Writing Result...
[2022-07-14 07:54:54,961] [96] [MainThread] [INFO] (monailabel.transform.writer:94) - Result ext: .nrrd; write_to_file: True; dtype: uint16(torch.float32)

Thank you for your reply!
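For context, the post-transforms in the log above (Activationsd, AsDiscreted, ToNumpyd) amount to a softmax over the channel dimension, an argmax, and a tensor-to-array conversion. A minimal NumPy sketch of those steps, with fabricated shapes; this is illustrative, not the actual app code:

```python
import numpy as np

# Hypothetical 2-channel network output: [background, spleen] logits.
logits = np.random.randn(2, 8, 8, 8)

# Activationsd(softmax=True): logits -> per-voxel class probabilities.
e = np.exp(logits - logits.max(axis=0, keepdims=True))
probs = e / e.sum(axis=0, keepdims=True)

# AsDiscreted(argmax=True): probabilities -> label indices, keeping
# the channel dimension, as in the logged shape pred: (1, ...).
pred = probs.argmax(axis=0, keepdims=True)

print(pred.shape)  # (1, 8, 8, 8)
```

This matches the logged shape change from a 2-channel `pred` to a single-channel label map before it is restored to the original image geometry and written out.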

diazandr3s commented 2 years ago

Hi @Ichiruchan,

Thanks for reporting this. We created the spleen segmentation model only as an example to demonstrate how you can use MONAI Label, so what you're experiencing can happen: the model was not trained extensively. Could you please try the (multi-organ) segmentation model and see how that goes?

By default, the segmentation model should segment all these organs: https://github.com/Project-MONAI/MONAILabel/blob/main/sample-apps/radiology/lib/configs/segmentation.py#L34-L46

Hope this helps
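For context, a MONAI Label task config exposes its organs as a name-to-index mapping, and the viewer colors each voxel by its label index. The sketch below is purely illustrative: the names and indices here are hypothetical, and the real list is in the linked segmentation.py.

```python
# Hypothetical labels mapping, shaped like a MONAI Label task config.
# The actual organ names and indices are in the linked segmentation.py.
labels = {
    "spleen": 1,
    "liver": 2,
}

# A viewer colors each voxel by its label index, so a segmentation
# mask is only interpretable together with this mapping.
index_to_name = {v: k for k, v in labels.items()}
print(index_to_name[1])  # spleen
```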

Ichiruchan commented 2 years ago


Actually, the second image above is the result of the segmentation model. I tried it again and manually changed the color of the spleen segment to blue-green to make it visible. The result is clearly correct, which is why I think the spleen segmentation model itself may have a problem.

[screenshot: segmentation model result with the spleen segment recolored blue-green]
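One quick way to tell a model problem from a viewer colormap problem is to inspect which label indices actually occur in the written mask. A minimal sketch, with a fabricated array standing in for the mask loaded from the .nrrd result (real code would load it, e.g. with the pynrrd or SimpleITK packages):

```python
import numpy as np

# Fabricated stand-in for the segmentation mask loaded from the
# .nrrd result; real code would read it from disk instead.
mask = np.zeros((16, 16, 8), dtype=np.uint16)
mask[4:8, 4:8, 2:5] = 1  # hypothetical spleen label index

# Count voxels per label index present in the mask. If only the
# expected index appears, the mask itself is consistent and any
# wrong organ color would come from the viewer's label mapping.
labels, counts = np.unique(mask, return_counts=True)
print(dict(zip(labels.tolist(), counts.tolist())))  # {0: 2000, 1: 48}
```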
diazandr3s commented 2 years ago

Hi @Ichiruchan,

Thanks for the follow-up.

> Actually, the second image above is the result of segmentation model.

Sorry, I missed this bit.

As the segmentation model wasn't extensively trained on multiple datasets, it is possible to face these sorts of issues when running inference on a dataset different from the Task 09 Spleen dataset of the Medical Segmentation Decathlon.

I hope this makes sense.

diazandr3s commented 2 years ago

As this is a model issue rather than a code issue, I'm closing this one.

Please reopen if there is something unclear. Thanks again!