Open aronayne opened 5 years ago
Hi @yaringal
Thanks for sharing this information.
The code you posted on your blog (http://mlg.eng.cam.ac.uk/yarin/blog_3d801aa532c1ce.html#uncertainty-sense) is meant to be used as part of training, so is T the number of network training epochs?
```python
probs = []
for _ in xrange(T):
    probs += [model.output_probs(input_x)]
predictive_mean = numpy.mean(probs, axis=0)
predictive_variance = numpy.var(probs, axis=0)
tau = l**2 * (1 - model.p) / (2 * N * model.weight_decay)
predictive_variance += tau**-1
```
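For context, here is a self-contained sketch of what that loop computes, using a hypothetical `stochastic_forward` function in place of `model.output_probs` and placeholder hyperparameter values (T, l, N, p, weight_decay are assumptions, not values from the blog post). The point is that each of the T calls applies a fresh dropout mask, so the outputs differ across passes and their spread carries the uncertainty:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for model.output_probs: each call draws a fresh
# dropout mask, so repeated calls on the same input give different outputs.
def stochastic_forward(input_x, p=0.5):
    mask = rng.random(input_x.shape) > p   # Bernoulli dropout mask
    return (input_x * mask) / (1.0 - p)    # inverted-dropout scaling

# Placeholder hyperparameters (assumptions for this sketch)
T = 100            # number of stochastic forward passes
l = 1.0            # prior length-scale
N = 1000           # training-set size
p = 0.5            # dropout probability
weight_decay = 1e-4

input_x = np.ones(10)

# T stochastic forward passes through the dropout-active network
probs = np.stack([stochastic_forward(input_x, p) for _ in range(T)])

predictive_mean = probs.mean(axis=0)
predictive_variance = probs.var(axis=0)

# Model-precision correction term from the blog post's formula
tau = l**2 * (1 - p) / (2 * N * weight_decay)
predictive_variance += tau**-1
```

With T = 1 the sample variance over the passes would be identically zero, leaving only the tau**-1 term, which is why the variance estimate needs multiple passes.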
Since a single forward pass is used to make a prediction p on a trained network, when computing the uncertainty of p does T become 1, so that

```python
for _ in xrange(T):
    probs += [model.output_probs(input_x)]
```

is just:

```python
probs += [model.output_probs(input_x)]
```

?