JaxGaussianProcesses / GPJax

Gaussian processes in JAX.
https://docs.jaxgaussianprocesses.com/
Apache License 2.0

dev: VI: Refactor natural gradients, add dual parameterisation, add pseudo data. #181

Open · daniel-dodd opened this issue 1 year ago

daniel-dodd commented 1 year ago

Though mentioned in the meeting minutes, I'm making my development on this public. I will tick things off as and when they are completed.

Checklist

Add:

Rethink:

Refactor:

Updates

19-Jan:

daniel-dodd commented 1 year ago

As an aside to the above, it would be nice to add a high-level, sklearn-style interface to train a model and quickly "update" it with (or fantasise) new data for use in active learning:

```python
from tunegp import SVGP

svgp = SVGP()
svgp.fit(X_train, y_train)   # train with natural gradients
svgp.update(X_new, y_new)    # fantasise new data
svgp.predict(X_test)         # predict
```
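A minimal sketch of what such a wrapper could look like, under loose assumptions: `SketchGP` is a hypothetical class (not part of GPJax or any `tunegp` package), exact GP regression with a fixed RBF kernel stands in for the proposed SVGP, and `update` simply conditions on the enlarged training set rather than performing a cheap low-rank fantasy update or natural-gradient steps.

```python
import numpy as np


class SketchGP:
    """Hypothetical sklearn-style GP wrapper illustrating the proposed
    fit/update/predict interface. Exact GP regression is used here for
    simplicity; the real proposal targets an SVGP trained with natural
    gradients."""

    def __init__(self, lengthscale=1.0, noise=0.1):
        self.lengthscale = lengthscale
        self.noise = noise
        self.X = None
        self.y = None

    def _kernel(self, A, B):
        # Squared-exponential (RBF) kernel between row-vector inputs.
        sq_dists = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-0.5 * sq_dists / self.lengthscale**2)

    def _refit(self):
        # Solve (K + noise * I) alpha = y for the predictive weights.
        K = self._kernel(self.X, self.X) + self.noise * np.eye(len(self.X))
        self.alpha = np.linalg.solve(K, self.y)
        return self

    def fit(self, X, y):
        self.X, self.y = np.asarray(X, float), np.asarray(y, float)
        return self._refit()

    def update(self, X_new, y_new):
        # "Fantasise" new data by conditioning on the enlarged training set.
        self.X = np.vstack([self.X, np.asarray(X_new, float)])
        self.y = np.concatenate([self.y, np.asarray(y_new, float)])
        return self._refit()

    def predict(self, X_test):
        # Posterior predictive mean at the test inputs.
        return self._kernel(np.asarray(X_test, float), self.X) @ self.alpha


# Example usage of the proposed workflow:
gp = SketchGP().fit([[0.0], [1.0], [2.0]], [0.0, 1.0, 0.0])
gp.update([[3.0]], [1.0])            # condition on a new observation
mean = gp.predict([[1.5], [2.5]])    # predictive means at test points
```

Returning `self` from `fit` and `update` keeps the interface chainable, matching the sklearn convention the comment above alludes to.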
github-actions[bot] commented 6 days ago

There has been no recent activity on this issue. To keep our issues log clean, we remove old and inactive issues. Please update to the latest version of GPJax and check if that resolves the issue. Let us know if that works for you by leaving a comment. This issue is now marked as stale and will be closed if no further activity occurs. If you believe that this is incorrect, please comment. Thank you!