shogun-toolbox / shogun


Multiclass Laplace approximation for GPs #1901

Open karlnapf opened 10 years ago

karlnapf commented 10 years ago

This task is to implement the multiclass version of the Laplace approximation for GPs. It builds on the soft-max likelihood from #1898 and will be used by the GP multiclass machines from #1900. Code for this can be found in the GPstuff toolbox; the algorithm is again described in the GP book (Rasmussen & Williams) and in the original Barber paper, see #1900.

The task requires implementing a few non-trivial algorithms. It would be best if the existing CLaplacianInferenceMethod class could be extended to cover this case.
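
For anyone picking this up, here is a minimal NumPy sketch (not Shogun code; `multiclass_laplace`, `softmax_rows` and the dense nC x nC formulation are illustrative assumptions) of the Newton mode-finding and the Laplace evidence for the soft-max likelihood, following the multiclass Laplace derivation in Section 3.5 of the GP book:

```python
import numpy as np

def softmax_rows(F):
    """Row-wise soft-max: F is (n, C) latent values, returns (n, C) class probabilities."""
    F = F - F.max(axis=1, keepdims=True)   # for numerical stability
    E = np.exp(F)
    return E / E.sum(axis=1, keepdims=True)

def multiclass_laplace(K_blocks, Y, max_iter=100, tol=1e-9):
    """Newton mode-finding + Laplace evidence for a multiclass GP with
    soft-max likelihood (naive dense version, cf. Rasmussen & Williams, Sec. 3.5).

    K_blocks : list of C (n, n) kernel matrices, one per (a priori independent) latent function
    Y        : (n, C) one-hot labels
    Returns the mode as an (n, C) array and the approximate log marginal likelihood.
    """
    n, C = Y.shape
    # Block-diagonal prior covariance of the class-major stacked latent vector f (length nC)
    K = np.zeros((n * C, n * C))
    for c in range(C):
        K[c * n:(c + 1) * n, c * n:(c + 1) * n] = K_blocks[c]
    y = Y.T.ravel()                 # class-major stacking, matching f
    f = np.zeros(n * C)
    a = np.zeros(n * C)
    for _ in range(max_iter):
        P = softmax_rows(f.reshape(C, n).T)                    # (n, C) probabilities
        pi = P.T.ravel()
        Pi = np.vstack([np.diag(P[:, c]) for c in range(C)])   # (nC, n)
        W = np.diag(pi) - Pi @ Pi.T          # W = -Hessian of log p(y|f), PSD but singular
        b = W @ f + y - pi                   # Newton right-hand side
        a = np.linalg.solve(np.eye(n * C) + W @ K, b)   # chosen so that f_new = K a
        f_new = K @ a                        # Newton step (no damping/line search here)
        if np.max(np.abs(f_new - f)) < tol:
            f = f_new
            break
        f = f_new
    # Laplace approximation to log q(y|X): -1/2 a^T f + log p(y|f) - 1/2 log|I + K W|
    P = softmax_rows(f.reshape(C, n).T)
    Pi = np.vstack([np.diag(P[:, c]) for c in range(C)])
    W = np.diag(P.T.ravel()) - Pi @ Pi.T
    _, logdet = np.linalg.slogdet(np.eye(n * C) + K @ W)
    log_Z = -0.5 * (a @ f) + np.sum(Y * np.log(P + 1e-300)) - 0.5 * logdet
    return f.reshape(C, n).T, log_Z
```

With a single kernel shared across classes one would pass `K_blocks = [K_shared] * C`. A real implementation (Algorithm 3.3 in the GP book, or an extension of CLaplacianInferenceMethod) should exploit the block structure of K and W to bring the cost down from O((nC)^3) to roughly O(C n^3).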

yorkerlin commented 8 years ago

This issue is partially solved by https://github.com/shogun-toolbox/shogun/pull/2484

TODO: add the implicit gradient w.r.t. the hyperparameters
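
For context on that TODO: the gradient of the Laplace evidence log Z with respect to a kernel hyperparameter theta_j splits into an explicit part (mode f_hat held fixed) and an implicit part that accounts for f_hat itself moving with theta_j. Below is a sketch of the decomposition, following the binary-case treatment in Section 5.5 of the GP book and assuming the likelihood carries no hyperparameters of its own (notation: a = K^{-1} f_hat, W = -Hessian of log p(y | f_hat)):

```latex
% Explicit + implicit decomposition of the Laplace evidence gradient
% (a = K^{-1}\hat f,  W = -\nabla\nabla \log p(y \mid \hat f))
\frac{\partial \log Z}{\partial \theta_j}
  = \underbrace{\frac{1}{2}\, a^\top \frac{\partial K}{\partial \theta_j}\, a
      - \frac{1}{2}\,\operatorname{tr}\!\left[(I + K W)^{-1}
        \frac{\partial K}{\partial \theta_j}\, W\right]}_{\text{explicit}}
  \;+\;
    \underbrace{\sum_i \frac{\partial \log Z}{\partial \hat f_i}\,
      \frac{\partial \hat f_i}{\partial \theta_j}}_{\text{implicit}},
\qquad
\frac{\partial \hat f}{\partial \theta_j}
  = (I + K W)^{-1}\, \frac{\partial K}{\partial \theta_j}\,
    \nabla \log p(y \mid \hat f).
```

The implicit part is the non-trivial bit: its d log Z / d f_hat_i factor comes from the -1/2 log|I + K W| term and therefore needs the third derivatives of the soft-max log-likelihood, which in the multiclass case form a tensor rather than the per-point scalars of the binary case.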

MikeLing commented 7 years ago

Hi @karlnapf and @yorkerlin, I would like to work on this and on issue #1902, but I don't yet know much about GPs. Could I start learning GPs now and try to fix these issues by reading the Shogun code and the relevant documents? I know the basic ideas of ML, so I think the whole process (learning and committing some workable code) would take one or two months. Is that OK? Thank you :)

karlnapf commented 7 years ago

Yes, contributions are welcome here. Let us know if you need help. Reading the existing code, the examples, and the GP book by Rasmussen is probably the best start.

aghinsa commented 4 years ago

Is this issue still open?