yulequan / UA-MT

Code for the MICCAI 2019 paper 'Uncertainty-aware Self-ensembling Model for Semi-supervised 3D Left Atrium Segmentation'.
https://arxiv.org/abs/1907.07034

Implementation of Monte Carlo Dropout #5

Closed luciaL closed 4 years ago

luciaL commented 4 years ago

Hello, @yulequan

I read your code carefully, and uncertainty is a new idea for me. I see your implementation of Monte Carlo Dropout below:

```python
T = 8
volume_batch_r = unlabeled_volume_batch.repeat(2, 1, 1, 1, 1)
stride = volume_batch_r.shape[0] // 2
preds = torch.zeros([stride * T, 2, 112, 112, 80]).cuda()
for i in range(T // 2):
    ema_inputs = volume_batch_r + torch.clamp(torch.randn_like(volume_batch_r) * 0.1, -0.2, 0.2)
    with torch.no_grad():
        preds[2 * stride * i:2 * stride * (i + 1)] = ema_model(ema_inputs)
preds = F.softmax(preds, dim=1)
preds = preds.reshape(T, stride, 2, 112, 112, 80)
preds = torch.mean(preds, dim=0)  # (batch, 2, 112, 112, 80)
uncertainty = -1.0 * torch.sum(preds * torch.log(preds + 1e-6), dim=1, keepdim=True)
```

I wonder if this is the most common way to implement uncertainty with the mean teacher method? It looks like your implementation adds perturbation to the inputs rather than applying dropout to the network, although both act as regularization.
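For comparison, the textbook Monte Carlo Dropout approach keeps the dropout layers stochastic at inference time and averages several forward passes. Here is a minimal sketch with a hypothetical toy classifier (not the repo's V-Net); `mc_dropout_predict` is a name I made up for illustration:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Toy classifier with a dropout layer (hypothetical stand-in for the real network).
model = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Dropout(p=0.5), nn.Linear(32, 2))

def mc_dropout_predict(model, x, T=8):
    """Run T stochastic forward passes with dropout active and average the softmax."""
    model.train()  # train() keeps dropout layers stochastic during inference
    with torch.no_grad():
        probs = torch.stack([F.softmax(model(x), dim=1) for _ in range(T)])
    return probs.mean(dim=0)  # (batch, num_classes)

x = torch.randn(4, 16)
mean_probs = mc_dropout_predict(model, x)
# Predictive entropy of the averaged distribution as the uncertainty measure.
entropy = -(mean_probs * torch.log(mean_probs + 1e-6)).sum(dim=1)
print(mean_probs.shape, entropy.shape)  # torch.Size([4, 2]) torch.Size([4])
```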

yulequan commented 4 years ago

Hi,

I have added both dropout and perturbation to inputs when calculating the uncertainty.

luciaL commented 4 years ago

Oh, could you please point me to the lines that calculate the uncertainty? Thank you so much for your reply. Best wishes.

yulequan commented 4 years ago

The code you quoted above is what calculates the uncertainty: we add different perturbations to the inputs and compute the entropy of the average prediction. Besides entropy, you could also use the variance across the stochastic predictions.
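The two measures can be sketched side by side on a stack of T softmax predictions. This is a minimal sketch with made-up small shapes, not the repo's (T, batch, 2, 112, 112, 80) tensors:

```python
import torch

# Hypothetical stack of T softmax predictions, shape (T, batch, C, H, W).
T, batch, C = 8, 2, 2
preds = torch.softmax(torch.randn(T, batch, C, 4, 4), dim=2)

mean_probs = preds.mean(dim=0)  # average prediction over the T passes
# Entropy of the average prediction (what the quoted code computes):
entropy = -(mean_probs * torch.log(mean_probs + 1e-6)).sum(dim=1, keepdim=True)
# Alternative: per-class variance across the T passes, summed over classes:
variance = preds.var(dim=0).sum(dim=1, keepdim=True)
print(entropy.shape, variance.shape)  # both torch.Size([2, 1, 4, 4])
```

Both give one uncertainty value per voxel; entropy looks at how spread out the averaged distribution is, while variance looks at how much the T predictions disagree.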

luciaL commented 4 years ago

Okay, got it. Thank you so much.

1215232494 commented 2 years ago

Hello, @yulequan @luciaL Why is the `repeat` needed? I don't understand this step.
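My reading of the quoted code (an assumption, not confirmed by the authors): `repeat(2, 1, 1, 1, 1)` doubles the batch so each forward pass of `ema_model` evaluates two independently perturbed copies at once, which yields T = 8 stochastic predictions in only T//2 = 4 passes; the later `reshape(T, stride, ...)` groups them back per sample. A small-tensor sketch of the indexing:

```python
import torch

# Small stand-in shapes instead of (batch, 1, 112, 112, 80).
batch, C, D = 2, 2, 3
unlabeled = torch.randn(batch, C, D)

# repeat(2, 1, 1) stacks two identical copies of the batch: shape (2*batch, C, D).
rep = unlabeled.repeat(2, 1, 1)
stride = rep.shape[0] // 2  # == batch
assert torch.equal(rep[:stride], rep[stride:])  # same data, twice

# Each pass adds fresh noise to the doubled batch, so one pass produces
# 2*stride predictions and T = 8 predictions need only T//2 = 4 passes.
T = 8
preds = torch.zeros(stride * T, C, D)
for i in range(T // 2):
    noisy = rep + torch.clamp(torch.randn_like(rep) * 0.1, -0.2, 0.2)
    preds[2 * stride * i:2 * stride * (i + 1)] = noisy  # stand-in for ema_model(...)

# Consecutive chunks of `stride` rows are one noisy copy of the batch,
# so this reshape recovers T stochastic predictions per sample.
preds = preds.reshape(T, stride, C, D)
print(preds.shape)  # torch.Size([8, 2, 2, 3])
```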