param1101 opened 11 months ago
Hi @param1101,
If I create a Unet model and use an inferer, does this automatically turn off dropout at inference?
Yes, that's correct.
If so, what can I do to turn on dropout at inference? I want to quantify the uncertainty in predictions using this, but I couldn't really find anything related to it.
Note that torch.no_grad() only disables gradient tracking; to keep dropout stochastic at inference you need to put the dropout layers back into training mode. You can see how this is handled here: https://github.com/Project-MONAI/MONAILabel/blob/0.4.0/monailabel/tasks/scoring/epistemic.py#L55
More info about this: https://discuss.pytorch.org/t/model-eval-vs-with-torch-no-grad/19615/2
Here is the class we have created for Epistemic uncertainty computation using Dropout: https://github.com/Project-MONAI/MONAILabel/blob/0.4.0/monailabel/tasks/scoring/epistemic.py#L27
As you can see in this example, it uses the network defined with Dropout: https://github.com/Project-MONAI/MONAILabel/blob/0.4.0/sample-apps/radiology/lib/configs/deepedit.py#L204
And here is more information about it: https://github.com/Project-MONAI/MONAILabel/wiki/Active-Learning#epistemic-based-entropy-uncertainty-using-dropout
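As a minimal sketch of the general idea (not the MONAILabel implementation itself, and using a toy network rather than a UNet), Monte Carlo dropout at inference in PyTorch typically looks like this: set the model to eval mode, re-enable just the dropout layers, and run several stochastic forward passes.

```python
import torch
import torch.nn as nn

def enable_dropout(model: nn.Module) -> None:
    """Switch only the dropout layers back to train mode so they stay active.

    The tuple of dropout classes here is an assumption; extend it if your
    network uses other dropout variants.
    """
    for m in model.modules():
        if isinstance(m, (nn.Dropout, nn.Dropout2d, nn.Dropout3d)):
            m.train()

# Toy network standing in for a UNet defined with dropout.
model = nn.Sequential(nn.Linear(4, 8), nn.Dropout(p=0.5), nn.Linear(8, 2))
model.eval()           # batch norm etc. stay in eval mode
enable_dropout(model)  # dropout is stochastic again

x = torch.randn(1, 4)
with torch.no_grad():  # no gradients needed while sampling predictions
    samples = torch.stack([model(x) for _ in range(20)])

mean = samples.mean(dim=0)  # averaged prediction
std = samples.std(dim=0)    # per-output spread as an epistemic uncertainty estimate
```

The spread (std) across the stochastic passes is one common proxy for epistemic uncertainty; the MONAILabel class linked above computes an entropy-based score instead.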
Hope this helps,