piEsposito / blitz-bayesian-deep-learning

A simple and extensible library to create Bayesian Neural Network layers on PyTorch.
GNU General Public License v3.0

A suggestion to Blitz #78

Closed HDRah closed 3 years ago

HDRah commented 3 years ago

Firstly, thanks for your implementation of Bayes by Backprop in Blitz. This is a very nice tool and has helped us a lot. However, I have a minor suggestion and hope you can consider it. It would be very helpful if training a Bayesian layer returned not only the total loss but also the two separate losses: the log likelihood and the KL divergence. This would make it easier to see how the trade-off between the two losses is achieved, and would benefit our training process.

flydephone commented 3 years ago

@HDRah Hi, since you mentioned the log likelihood and KL divergence, could you answer my question below?

https://github.com/piEsposito/blitz-bayesian-deep-learning/issues/83

piEsposito commented 3 years ago

@HDRah , you can already do that with BLiTZ: When using the @variational_estimator decorator, you can calculate the separate losses as:

outputs = self(inputs)
loss = criterion(outputs, labels)                          # data-fit (negative log likelihood) term
loss += self.nn_kl_divergence() * complexity_cost_weight   # weighted KL divergence term

That might solve it.
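To make the trade-off concrete, the two ELBO terms can be computed and logged separately before summing. Below is a minimal, library-independent sketch (plain Python, not BLiTZ's API) using a Gaussian negative log likelihood and the closed-form KL divergence between two univariate Gaussians; the specific values and the `complexity_cost_weight` of 1/64 are illustrative assumptions.

```python
import math

def gaussian_nll(y, y_hat, sigma=1.0):
    # Negative log likelihood of observation y under N(y_hat, sigma^2).
    return 0.5 * math.log(2 * math.pi * sigma ** 2) + (y - y_hat) ** 2 / (2 * sigma ** 2)

def kl_gaussians(mu_q, sigma_q, mu_p, sigma_p):
    # Closed-form KL( N(mu_q, sigma_q^2) || N(mu_p, sigma_p^2) ).
    return (math.log(sigma_p / sigma_q)
            + (sigma_q ** 2 + (mu_q - mu_p) ** 2) / (2 * sigma_p ** 2)
            - 0.5)

# Illustrative numbers: one prediction vs. target, one posterior vs. prior.
nll = gaussian_nll(y=1.0, y_hat=0.8)
kl = kl_gaussians(mu_q=0.2, sigma_q=0.5, mu_p=0.0, sigma_p=1.0)

complexity_cost_weight = 1 / 64  # e.g. 1 / number of minibatches (assumption)
total_loss = nll + kl * complexity_cost_weight

# Tracking nll and kl separately (as the issue requests) shows how the
# trade-off between data fit and regularization evolves during training.
```

Keeping the two scalars around before adding them costs nothing and lets you plot both curves, which is exactly the diagnostic the original suggestion asks for.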