UCL-SML / Doubly-Stochastic-DGP

Deep Gaussian Processes with Doubly Stochastic Variational Inference
Apache License 2.0

Variance computation #35

Open Hebbalali opened 5 years ago

Hebbalali commented 5 years ago

Hello Hugh, I'm coming back to discuss another subject around the doubly stochastic DGP. When predicting with the doubly stochastic DGP, you suggest using a Gaussian mixture of the variational posteriors obtained by drawing S samples through the DGP. However, wouldn't using a mixture of Gaussians make the variance of the resulting distribution tend to zero as the number of samples S increases (eq 18 in your doubly stochastic DGP paper)? If so, is this a reliable approach for uncertainty quantification in DGPs?
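To make the worry concrete, here is a small numpy sketch (hypothetical numbers, not using the repo's API) of the shrinkage I have in mind: the variance of the *average of S draws* collapses like 1/S.

```python
import numpy as np

rng = np.random.default_rng(0)

# Each of S forward passes yields a Gaussian; here all have variance v_true.
# If we draw one sample per pass and average the draws, the variance of that
# average shrinks like v_true / S.
v_true = 0.5
var_of_mean = {}
for S in (10, 1000):
    # 2000 independent repeats of "draw S samples, then average them"
    draws = rng.normal(0.0, np.sqrt(v_true), size=(S, 2000))
    var_of_mean[S] = draws.mean(axis=0).var()   # roughly v_true / S
print(var_of_mean)
```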

hughsalimbeni commented 5 years ago

I think you're mixing up averaging over the variables and averaging over the densities: in eq 18 the averaging is over densities, not variables, and the variance of a mixture does not shrink as the number of components grows. To obtain the empirical mean and variance from the samples you can use the following (from e.g. here):

import numpy as np

ms, vs = model.predict_y(Xs, S)   # each is S, N, Dy

# the first two moments of the equally weighted Gaussian mixture over the S samples
m = np.average(ms, 0)
v = np.average(vs + ms**2, 0) - m**2