Closed starsky68 closed 4 years ago
If you meant to change the loss function, then yes it is definitely doable. What do you mean by re-parameters?
Does "reparameterization" mean something like this? Reparameterization trick: https://github.com/Harry24k/bayesian-neural-network-pytorch/blob/e6c6a858064319008912d3a91941a802769520ef/torchbnn/modules/linear.py#L86.
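For reference, here is a minimal NumPy sketch of the standard reparameterization trick for a Gaussian weight posterior (similar in spirit to the linked `torchbnn` code, but not that exact implementation): the weight is sampled as `w = mu + sigma * eps` with `eps ~ N(0, 1)`, so gradients can flow through `mu` and `sigma`. The function and variable names here are illustrative only.

```python
import numpy as np

def reparameterize(mu, rho, rng):
    """Sample w ~ N(mu, sigma^2) via the reparameterization trick."""
    # sigma is derived from the free parameter rho via softplus,
    # which keeps it strictly positive
    sigma = np.log1p(np.exp(rho))
    # eps ~ N(0, 1); shifting and scaling it gives the weight sample
    eps = rng.standard_normal(np.shape(mu))
    return mu + sigma * eps

rng = np.random.default_rng(0)
mu = np.zeros(4)
rho = np.full(4, -3.0)  # softplus(-3) ~= 0.049, i.e. a small sigma
w = reparameterize(mu, rho, rng)
```

Because the randomness lives entirely in `eps`, the sample `w` is a differentiable function of `mu` and `rho`, which is the whole point of the trick.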
https://github.com/xuanqing94/BayesianDefense/blob/c4c0be9b258f40130b40d6a6e009c459666f2722/main_vi.py#L97. Excuse me, why divide by 100 in this calculation formula?
@starsky68
https://github.com/xuanqing94/BayesianDefense/blob/c4c0be9b258f40130b40d6a6e009c459666f2722/utils/loss.py#L16. Thank you for your answers. I have another question: in this formula, is there any difference between adding and subtracting the KL loss? Or did I miss some details?
@starsky68 You should add the KL term, as shown in the paper (Eq. 13, p. 5).
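To make the sign explicit: in a variational loss the KL term is added to the data-fit term, not subtracted (maximizing the ELBO is minimizing NLL + KL). Below is a hedged sketch using the closed-form KL between a diagonal Gaussian posterior N(mu, sigma^2) and a standard normal prior N(0, 1); the actual repo may use a different prior or a Monte Carlo KL estimate, and the names here are illustrative.

```python
import numpy as np

def kl_gaussian(mu, sigma):
    # Closed-form KL( N(mu, sigma^2) || N(0, 1) ), summed over all weights:
    # KL = log(1/sigma) + (sigma^2 + mu^2)/2 - 1/2
    return np.sum(np.log(1.0 / sigma) + (sigma**2 + mu**2) / 2.0 - 0.5)

def vi_loss(nll, mu, sigma, kl_weight=1.0):
    # Total loss = data NLL + weighted KL term: the KL is ADDED,
    # so it acts as a regularizer pulling the posterior toward the prior.
    return nll + kl_weight * kl_gaussian(mu, sigma)
```

When the posterior exactly matches the prior (`mu = 0`, `sigma = 1`) the KL term vanishes and the loss reduces to the plain NLL; subtracting the KL instead would reward posteriors far from the prior, which is the wrong direction.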
Can KLLoss and re-parameters be customized?