kailaix / ADCME.jl

Automatic Differentiation Library for Computational and Mathematical Engineering
https://kailaix.github.io/ADCME.jl/latest/
MIT License

question about the uncertainty part #70

Open he-del opened 3 years ago

he-del commented 3 years ago

Hi, I have read the part about uncertainty quantification of neural networks, and I'm confused about the log-likelihood function you mentioned, i.e. -sum((y[:,1] - obs[:,1]).^2)/(2σ^2) - sum(x.^2)/(2σx^2), especially the values of σ and σx. Is there any reference for this part? Thanks!

kailaix commented 3 years ago

Hi, this log-likelihood function arises from a Bayesian point of view. Consider inferring x given observations y; by Bayes' rule we have

p(x|y) ∝ p(y|x)p(x) => log(p(x|y)) = log p(y|x) + log p(x) + constant
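Concretely, if the observation noise is assumed Gaussian with standard deviation σ and x is given a zero-mean Gaussian prior with standard deviation σx, then log p(y|x) and log p(x) reduce (up to additive constants) to the two terms in the expression above; σ and σx are modeling choices, not quantities estimated by the formula. A minimal sketch in Python (the function name and the default values are illustrative, not part of ADCME's API):

```python
# Sketch of the log posterior log p(x|y) = log p(y|x) + log p(x) + const,
# assuming Gaussian observation noise (std sigma) and a zero-mean
# Gaussian prior on x (std sigma_x). Values of sigma/sigma_x are
# placeholders chosen for illustration.
import numpy as np

def log_posterior(y, obs, x, sigma=0.1, sigma_x=1.0):
    # Gaussian likelihood: log p(y|x) = -||y - obs||^2 / (2 sigma^2) + const
    log_lik = -np.sum((y - obs) ** 2) / (2 * sigma ** 2)
    # Gaussian prior:      log p(x)   = -||x||^2 / (2 sigma_x^2) + const
    log_prior = -np.sum(x ** 2) / (2 * sigma_x ** 2)
    return log_lik + log_prior

# Example: a perfect fit (y == obs) leaves only the prior term
x = np.array([1.0, -1.0])
print(log_posterior(np.zeros(3), np.zeros(3), x))  # -1.0
```

Maximizing this log posterior over x is equivalent to a least-squares fit with an L2 (Tikhonov) penalty whose strength is set by the ratio σ²/σx².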

You might want to look into some literature on Bayesian methods. Maybe this could be helpful: https://www.sagepub.com/sites/default/files/upm-binaries/18550_Chapter6.pdf

he-del commented 3 years ago

Thank you, this is helpful.