yaringal / DropoutUncertaintyDemos

What My Deep Model Doesn't Know...

Calculation of Tau is different from that on the blog #2

Open napsternxg opened 7 years ago

napsternxg commented 7 years ago

Hi @yaringal,

Thanks for the wonderful post and the code. I was going through your blog post and found some inconsistencies in the way you calculate tau in the blog, in the supporting Python code, and in this repo at the following line:

https://github.com/yaringal/DropoutUncertaintyDemos/blob/92a371ecfd00fdb514f1811778d44e9aad0ae1c9/convnetjs/regression_uncertainty.js#L164

As I understand it, you calculate tau_inv directly in the JS code. However, on the line above you add tau_inv to the standard deviation of y instead of the variance of y, as described on the blog. Is there a specific reason for this?

I have tried to replicate your analysis using PyTorch at https://github.com/napsternxg/pytorch-practice/blob/master/Pytorch%2BUncertainity.ipynb, based on the method proposed in the blog post. However, I am not getting the same uncertainties. Specifically, the uncertainty appears to be constant across all X, when it should be higher at the edges.

I would appreciate it if you could help me with this.
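
For what it's worth, one common way to end up with an uncertainty that is constant in X is for dropout to be switched off at prediction time: every forward pass is then identical, the sample variance is zero, and only the constant tau_inv term survives. A minimal PyTorch sketch of keeping dropout active (the network, layer sizes, and dropout rate are illustrative, not taken from the notebook above):

```python
import torch
import torch.nn as nn

# Hypothetical small regression net; sizes and dropout rate are illustrative.
model = nn.Sequential(
    nn.Linear(1, 100),
    nn.ReLU(),
    nn.Dropout(p=0.1),
    nn.Linear(100, 1),
)

def mc_dropout_predict(model, x, n_samples=100):
    """Stochastic forward passes with dropout left ON at prediction time."""
    model.train()  # nn.Dropout only drops units in train mode
    with torch.no_grad():
        return torch.stack([model(x) for _ in range(n_samples)])

x_test = torch.linspace(-2, 2, 50).unsqueeze(1)
samples = mc_dropout_predict(model, x_test)  # shape: (100, 50, 1)
```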

yaringal commented 7 years ago

Well spotted! The tau inverse (i.e. the variance) should be inside the square root. I'll have a look at your code when I get back to the UK next week (from a look at the video I would say it might be an issue with the prior length-scale or the bias initialization). Yarin
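
Concretely, the fix described above amounts to adding tau_inv to the sample variance before taking the square root, rather than adding it to the standard deviation. A small self-contained sketch (the sample tensor and tau_inv value are placeholders, not the demo's settings):

```python
import torch

samples = torch.randn(100, 50, 1)  # stand-in for 100 stochastic forward passes
tau_inv = 0.05                     # illustrative value

mean = samples.mean(dim=0)

# The form questioned above: tau_inv added to the standard deviation
std_buggy = samples.std(dim=0) + tau_inv

# The corrected form: tau_inv is a variance, so it belongs inside the square root
std = torch.sqrt(samples.var(dim=0) + tau_inv)
```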

napsternxg commented 7 years ago

Thanks. What is the usual range for the length-scale? In your code you directly use l2 = length-scale squared. Or is it a hyperparameter for which I need to try different values to see which one works?

yaringal commented 7 years ago

It's a hyperparameter which depends on the function and the range of X (incidentally, make sure your data is normalised).
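
For context, in the paper and blog post the precision tau is tied to the prior length-scale together with the dropout rate, dataset size, and weight decay. The sketch below encodes that relation as I read it, tau = l^2 * p_keep / (2 * N * lambda); treat the exact form as an assumption to check against the blog post rather than a quote from this repo:

```python
# Assumed relation (to verify against the blog post, not quoted from the repo):
#   tau = length_scale^2 * p_keep / (2 * N * weight_decay)
def tau_inverse(length_scale, p_drop, n_train, weight_decay):
    p_keep = 1.0 - p_drop  # probability that a unit is *kept*
    tau = length_scale**2 * p_keep / (2.0 * n_train * weight_decay)
    return 1.0 / tau

# A larger length-scale gives a larger tau, hence a smaller tau_inv,
# i.e. less constant noise added to the predictive variance.
print(tau_inverse(length_scale=1.0, p_drop=0.1, n_train=20, weight_decay=1e-4))
```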

napsternxg commented 7 years ago

Oh thanks. Yes, I didn't normalize my data, but I made sure that the range of X was between -1 and 1 in an updated version of the code, and that improved things a bit. Do I need to normalize y as well?

yaringal commented 7 years ago

Ideally yes, with neural networks.
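
A minimal sketch of standardising both X and y and then mapping predictions back to the original scale (the data here is synthetic, purely for illustration):

```python
import torch

torch.manual_seed(0)
x_train = torch.linspace(-3, 3, 20).unsqueeze(1)
y_train = torch.sin(x_train) + 0.1 * torch.randn_like(x_train)

# Standardise both inputs and targets before training
x_mean, x_std = x_train.mean(0), x_train.std(0)
y_mean, y_std = y_train.mean(0), y_train.std(0)
x_norm = (x_train - x_mean) / x_std
y_norm = (y_train - y_mean) / y_std

# ... train the dropout network on (x_norm, y_norm) ...

# Map predictions made in normalised space back to the original scale:
#   y_pred     = mean_norm * y_std + y_mean
#   y_pred_std = std_norm * y_std   # only the scale affects the std
```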
