JavierAntoran / Bayesian-Neural-Networks

Pytorch implementations of Bayes By Backprop, MC Dropout, SGLD, the Local Reparametrization Trick, KF-Laplace, SG-HMC and more
MIT License
1.83k stars 302 forks

2 questions: batch & layer #25

Open EvannaB opened 1 year ago

EvannaB commented 1 year ago

I have two questions; could you please help me with them?

  1. In the Colab notebook, this code doesn't train in batches, right?
  2. In the Colab notebook, can the MC_Dropout_Model run without the MC_Dropout_Layer?
JavierAntoran commented 1 year ago

Hi @EvannaB

  1. I believe all models train in minibatches by default.
  2. You can remove the MC dropout layer and the model will still run, but you will lose its uncertainty estimation capabilities: without dropout at test time, every forward pass is identical, so there is no predictive distribution to sample from.
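To illustrate the point in answer 2, here is a minimal sketch (not the repo's actual `MC_Dropout_Layer` implementation; the class and model below are hypothetical) of a dropout layer that stays stochastic at eval time, which is what makes Monte Carlo uncertainty estimates possible:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MCDropout(nn.Module):
    """Dropout that remains active in eval mode, so repeated forward
    passes draw different masks and approximate a predictive distribution."""
    def __init__(self, p: float = 0.5):
        super().__init__()
        self.p = p

    def forward(self, x):
        # training=True forces a fresh stochastic mask even under model.eval()
        return F.dropout(x, p=self.p, training=True)

# hypothetical small regressor using the layer
model = nn.Sequential(
    nn.Linear(1, 32),
    nn.ReLU(),
    MCDropout(p=0.5),
    nn.Linear(32, 1),
)
model.eval()

x = torch.ones(4, 1)
# T stochastic forward passes; their spread is the MC uncertainty estimate
samples = torch.stack([model(x) for _ in range(100)])  # shape (T, batch, 1)
mean, std = samples.mean(0), samples.std(0)
```

Replacing `MCDropout` with `nn.Identity()` makes every pass deterministic, so `std` collapses to zero and no uncertainty information remains.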