kumar-shridhar / PyTorch-BayesianCNN

Bayesian Convolutional Neural Network with Variational Inference based on Bayes by Backprop in PyTorch.
MIT License
1.42k stars 323 forks

Can I drop the KL term in loss function. #59

Closed LeavesLei closed 3 years ago

LeavesLei commented 3 years ago

Hi, I want to know whether I can drop the KL term, i.e., set beta = 0 in the loss function. Will it hurt the training process? Does it still count as Bayes by Backprop? Thanks, and looking forward to your reply 😄.

kumar-shridhar commented 3 years ago

Hi,

You can do that, but it will hurt performance to some degree: without the KL term there is no regularisation constraining the parameters, so they are left unconstrained and more prone to overfitting. Strictly speaking, it is also no longer Bayes by Backprop, since the objective being minimised is no longer the variational ELBO but just the likelihood term. Note that even with the KL term kept, training can go wrong: the KL often collapses to 0 or grows too large over time, and there are ways to counter that, such as KL annealing.
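For concreteness, here is a minimal dependency-free sketch of how beta weights the KL term in the loss, and one common annealing schedule. The function names (`beta_blundell`, `elbo_loss`) are illustrative, not this repository's exact API; the schedule itself is the per-minibatch weighting pi_i = 2^(M-i) / (2^M - 1) from the Bayes by Backprop paper, which front-loads the KL penalty early in each epoch. Setting beta = 0 recovers the pure likelihood loss discussed above.

```python
def beta_blundell(batch_idx, num_batches):
    # "Blundell" KL weight: pi_i = 2^(M - i) / (2^M - 1) for the
    # i-th of M minibatches (here batch_idx is 0-indexed, so i = batch_idx + 1).
    # Early minibatches carry most of the KL penalty, and the
    # weights sum to 1 over a full epoch.
    return 2 ** (num_batches - (batch_idx + 1)) / (2 ** num_batches - 1)

def elbo_loss(nll, kl, beta):
    # Minimised objective: negative log-likelihood plus beta-weighted KL.
    # beta = 0 drops the KL term entirely, i.e. no regularisation.
    return nll + beta * kl

# Per-batch beta weights over an epoch of 4 minibatches:
# [8/15, 4/15, 2/15, 1/15], which sums to 1.
weights = [beta_blundell(i, 4) for i in range(4)]
```

With a schedule like this, the KL contribution per batch shrinks as the epoch progresses instead of being fixed, which is one way to keep the KL term from dominating the loss early in training.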