piEsposito / blitz-bayesian-deep-learning

A simple and extensible library to create Bayesian Neural Network layers on PyTorch.
GNU General Public License v3.0

Question about a parameter of model.sample_elbo #108

Open KQL11 opened 2 years ago

KQL11 commented 2 years ago

Hi, I'm grateful for your code. I have a question about `complexity_cost_weight`. According to your code, I think the complexity cost weight should be `1/batch_size`. However, in your example `LeNet_MNIST.py`, the complexity cost weight is `1/50000`, and 50000 is the number of all training samples. So how should I choose the parameter: the batch size or the total number of training samples?
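
To make the two options concrete, here is a minimal sketch of where the parameter enters a training loop (the toy data, sizes, and model are placeholders; I am assuming the `sample_elbo` signature used in the repo's examples):

```python
import torch
import torch.nn as nn
from torch.utils.data import TensorDataset, DataLoader
from blitz.modules import BayesianLinear
from blitz.utils import variational_estimator

# Toy Bayesian regressor; the architecture is only illustrative.
@variational_estimator
class BayesianRegressor(nn.Module):
    def __init__(self):
        super().__init__()
        self.blinear = BayesianLinear(10, 1)

    def forward(self, x):
        return self.blinear(x)

model = BayesianRegressor()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-2)
criterion = nn.MSELoss()  # averages over the mini-batch by default

# Placeholder data: 50000 samples, mini-batches of 64.
dataset = TensorDataset(torch.randn(50000, 10), torch.randn(50000, 1))
loader = DataLoader(dataset, batch_size=64, shuffle=True)

batch_size = 64
dataset_size = len(dataset)  # 50000, matching the 1/50000 in LeNet_MNIST.py

for inputs, labels in loader:
    optimizer.zero_grad()
    loss = model.sample_elbo(inputs=inputs,
                             labels=labels,
                             criterion=criterion,
                             sample_nbr=3,
                             # as in the example -- or should it be 1 / batch_size?
                             complexity_cost_weight=1 / dataset_size)
    loss.backward()
    optimizer.step()
```

(If it matters for the answer: `nn.MSELoss`, like the `nn.CrossEntropyLoss` used in the LeNet example, reduces to a mean over the mini-batch by default.)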

tzuhungbrian commented 1 year ago

Same question here

mliyuanjie commented 1 year ago

> So how should I choose the parameter: the batch size or the total number of training samples?

I guess this parameter is not important; it is just a scaling factor on the loss value. It probably has no influence on fitting.
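
For context, a rough sketch of the quantity being averaged, assuming the standard Bayes-by-Backprop ELBO form (the exact reduction inside BLiTZ may differ):

$$\mathcal{L} \;\approx\; \frac{1}{S}\sum_{s=1}^{S}\Big[\,\mathrm{criterion}\big(f_{\theta_s}(x),\,y\big) \;+\; w \cdot \mathrm{KL}\big(q(\theta)\,\|\,p(\theta)\big)\Big]$$

where $S$ is `sample_nbr`, $\theta_s$ are the weights sampled on pass $s$, and $w$ is `complexity_cost_weight`. So the parameter rescales only the KL (complexity) term relative to the data-fit term.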