Closed: buntyke closed this issue 1 year ago.
If I were to implement this functionality in the Bayesian GPLVM model, could any of the admins or users please provide some pointers?
I am planning to refer to the vargplvm matlab toolbox and implement the dynamics functionality into bayesian_gplvm.py. Would this be a good starting point?
The inference is done in the GPy.inference package. The module you are looking for is var_dtc.py, where the inference for the Bayesian GPLVM is implemented.
I'd love to see the dynamics in the code. You will have to pass additional information to the inference method; I think you can do that at creation time? Also, the psi statistics change, if I'm not mistaken, @adamianou?
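For reference, the psi statistics mentioned above are the kernel expectations under the variational distribution q(X) (standard Bayesian GPLVM notation, written here as a sketch rather than GPy's exact code):

```latex
\psi_0 = \sum_{n} \mathbb{E}_{q(\mathbf{x}_n)}\!\left[k(\mathbf{x}_n, \mathbf{x}_n)\right], \qquad
(\Psi_1)_{nm} = \mathbb{E}_{q(\mathbf{x}_n)}\!\left[k(\mathbf{x}_n, \mathbf{z}_m)\right], \qquad
(\Psi_2)_{mm'} = \sum_{n} \mathbb{E}_{q(\mathbf{x}_n)}\!\left[k(\mathbf{x}_n, \mathbf{z}_m)\, k(\mathbf{x}_n, \mathbf{z}_{m'})\right]
```

With dynamics, the variational distribution over X is constrained by a temporal GP prior, which changes how q(X) is parameterized and hence how these expectations and their gradients are computed.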
On 28.07.2015 at 04:41, Nishanth Koganti notifications@github.com wrote:
This could also be done using a deep gp, @adamian, @zhenwendai, @jameshensman any ideas?
Looking forward to this enhancement!
@buntyke Did you try to implement it already? The best starting point is GPy.inference.latent_function_inference.var_dtc. This is where the normal variational inference is done for the sparse GP, and with that for the Bayesian GPLVM as well.
Let me know if you have any other questions.
I have only assigned @adamian as he is the expert on the matter, not that he will implement it...
@mzwiessele Sorry about that. I switched back to the Matlab toolbox as I was facing a deadline. Thanks for the pointers. I will work on this enhancement soon!
Is there an update on this? Have you gotten somewhere? Any questions?
Was any progress made on this?
We've definitely made progress in some of these areas, but I'm not sure how integrated they are with GPy at the moment. @adamian may know better.
On Wed, Apr 20, 2016 at 2:14 PM, cwlgadd notifications@github.com wrote:
I would like to implement this. Any comments on how I should do this to fit in with the general framework would be appreciated.
As the inference for this method is very specific, you will probably have to write a new inference method. Thus, the best starting point is https://github.com/SheffieldML/GPy/blob/devel/GPy/inference/latent_function_inference/var_dtc.py .
You need to implement the inference function, which computes the bound and its gradients given the kernel, the latent space X, the inducing inputs Z, the likelihood, and the observed outputs Y. I am confident you will figure it out by looking at the above file. You will also see the keyword arguments: if they are given to the inference, that means they are pre-computed and do not have to be re-computed. You can probably just factor out the first 50 or so lines into a function in var_dtc (something like ensure_statistics, with the keyword arguments as inputs). This then enables you to inherit from VarDTC, which lets you skip a lot of code (such as the init and get…)
In the inference you need to return a gradient dict, which holds the partial derivatives of the variational bound L with respect to {Kmm, psi0, psi1, psi2, likelihood variance (dL_dthetaL)}.
The code for the different derivatives in var_dtc will give you a lot of insight into those gradients, especially once you have done the maths for the dynamics.
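The interface described above can be sketched roughly as follows. This is a schematic only, assuming GPy's VarDTC conventions (an `inference` method, a returned `grad_dict` with the keys listed above); the class name `DynamicsVarDTC` and the helper `_compute_psi_statistics` are hypothetical, and array shapes are illustrated with plain nested lists to stay dependency-free:

```python
class DynamicsVarDTC:
    """Hypothetical skeleton of an inference class for a dynamical Bayesian GPLVM."""

    def inference(self, kern, X, Z, likelihood, Y,
                  psi0=None, psi1=None, psi2=None):
        # If the psi statistics arrive as keyword arguments, they are
        # pre-computed and must not be recomputed (see the discussion above).
        if psi0 is None:
            psi0, psi1, psi2 = self._compute_psi_statistics(kern, X, Z)

        # ... compute the variational bound L and the posterior here ...
        log_marginal = 0.0  # placeholder for the bound
        posterior = None    # placeholder for the posterior object

        # The gradient dict the rest of the framework expects back:
        grad_dict = {
            'dL_dKmm': [[0.0] * len(Z) for _ in Z],    # (M x M)
            'dL_dpsi0': [0.0] * len(X),                # (N,)
            'dL_dpsi1': [[0.0] * len(Z) for _ in X],   # (N x M)
            'dL_dpsi2': [[0.0] * len(Z) for _ in Z],   # (M x M)
            'dL_dthetaL': 0.0,  # gradient wrt the likelihood variance
        }
        return posterior, log_marginal, grad_dict

    def _compute_psi_statistics(self, kern, X, Z):
        # Placeholder: in a real implementation these are the kernel
        # expectations under q(X); here we return zeros of the right shape.
        psi0 = [0.0] * len(X)
        psi1 = [[0.0] * len(Z) for _ in X]
        psi2 = [[0.0] * len(Z) for _ in Z]
        return psi0, psi1, psi2
```

The actual maths for the bound and the gradients under a dynamical prior would replace the placeholders; the point here is only the shape of the interface and of the returned gradient dict.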
Hi all,
Has anybody added the dynamics to BayesianGPLVM in python?
Best, Somayeh
@adamian ?
This is not implemented and it's not in my short-term plans. As @mzwiessele mentioned, one would have to write a new inference method. The reparameterization suggested here: http://jmlr.org/papers/v17/damianou16a.html makes inference more robust, but it's a little tricky, so one could skip the reparameterization and just use a standard covariance parameterization.
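To illustrate the standard-parameterization idea: with dynamics, each latent dimension gets a GP prior over the observation times, whose covariance is built from a kernel on time. A minimal pure-stdlib sketch of such a temporal covariance matrix (the function name and the `lengthscale`/`variance` parameter names are assumptions for illustration, not GPy parameters):

```python
import math

def rbf_time_kernel(times, lengthscale=1.0, variance=1.0):
    """Covariance matrix K_t of a GP prior evaluated at the given time inputs,
    using a squared-exponential (RBF) kernel on time."""
    n = len(times)
    return [[variance * math.exp(-0.5 * ((times[i] - times[j]) / lengthscale) ** 2)
             for j in range(n)]
            for i in range(n)]

# Nearby time points get strongly correlated latent values; distant ones less so,
# which is what produces a smooth latent trajectory.
K_t = rbf_time_kernel([0.0, 0.1, 0.2, 1.0])
```

In the standard covariance parameterization one would optimize the variational covariance directly against this prior, rather than through the reparameterization from the paper linked above.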
Alternatively, one can pass the time inputs to a deep GP: https://github.com/SheffieldML/PyDeepGP/
Stale.
In the vargplvm toolbox, there is the option of adding dynamics to the latent space for learning a smooth latent manifold.
I am not able to provide such dynamics to the Bayesian GPLVM module in GPy. Is there a way to achieve this functionality?
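For readers unfamiliar with the vargplvm formulation, "adding dynamics" amounts to replacing the iid prior over the latent points with a GP prior across time for each latent dimension (a sketch of the idea, not of GPy code):

```latex
p(X \mid \mathbf{t}) = \prod_{q=1}^{Q} \mathcal{N}\!\left(\mathbf{x}_{:,q} \mid \mathbf{0}, \mathbf{K}_t\right),
```

where \(\mathbf{K}_t\) is a covariance matrix built from a kernel on the time inputs \(\mathbf{t}\). This couples the latent points across time, which is what yields the smooth latent manifold.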