karlnapf opened this issue 10 years ago
This issue is partially solved by https://github.com/shogun-toolbox/shogun/pull/2484
TODO: add the implicit gradient w.r.t. the hyper-parameters
Hi @karlnapf and @yorkerlin, I would like to work on this and issue #1902. However, I don't have much knowledge of GPs yet, so could I learn about GPs from now on and try to fix these issues by reading the Shogun code and related documents? I do know some basic ideas in ML, so I think the whole process (learning and committing some workable code) will take one or two months. Is that ok? Thank you :)
Yes, contributions are welcome here. Let us know if you need help. Reading the existing code, the examples, and the GP book by Rasmussen is probably the best start.
Is this issue still open?
This task is to implement the multiclass version of the Laplace approximation for GPs. It is based on the soft-max likelihood from #1898 and will be used by the GP multiclass machines from #1900. Code for this can be found in the GPstuff toolbox; the algorithm is again in the GP book or the original Barber paper, see #1900.
The task requires implementing a few non-trivial algorithms. It would be best if the existing CLaplacianInferenceMethod class could be extended for this case.
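To make the core of the task concrete, here is a rough Python sketch (not Shogun code) of the Newton mode-finding step for the multiclass Laplace approximation with a soft-max likelihood, along the lines of Algorithm 3.3 in the Rasmussen & Williams GP book. The dense `kron`-based linear algebra, the shared kernel across classes, and all function names are illustrative simplifications; a real implementation would exploit the block structure of W for efficiency, as the book describes.

```python
import numpy as np

def rbf_kernel(X, lengthscale=1.0):
    """Squared-exponential kernel matrix (illustrative choice)."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale ** 2)

def softmax(F):
    """Row-wise soft-max of an (n, C) matrix of latent values."""
    F = F - F.max(axis=1, keepdims=True)
    E = np.exp(F)
    return E / E.sum(axis=1, keepdims=True)

def laplace_multiclass_mode(K, Y, max_iter=50, tol=1e-8):
    """Find the posterior mode f-hat for a soft-max likelihood by
    Newton's method (naive dense version).
    K: (n, n) kernel matrix, shared across classes.
    Y: (n, C) one-hot labels. Returns the (n, C) latent mode."""
    n, C = Y.shape
    # Latents are stacked class-by-class: f = (f^1_1..f^1_n, f^2_1..., ...)
    Kbig = np.kron(np.eye(C), K)          # block-diagonal kernel
    f = np.zeros(n * C)
    for _ in range(max_iter):
        F = f.reshape(C, n).T             # (n, C)
        Pi = softmax(F)
        grad = (Y - Pi).T.ravel()         # gradient of log p(y|f)
        # W = negative Hessian of the log-likelihood; per-point C x C blocks
        W = np.zeros((n * C, n * C))
        for i in range(n):
            p = Pi[i]
            block = np.diag(p) - np.outer(p, p)
            idx = np.arange(C) * n + i
            W[np.ix_(idx, idx)] = block
        # Newton step: f_new = (K^-1 + W)^-1 (W f + grad)
        #            = K (I + W K)^-1 (W f + grad)
        b = W @ f + grad
        f_new = Kbig @ np.linalg.solve(np.eye(n * C) + W @ Kbig, b)
        if np.max(np.abs(f_new - f)) < tol:
            f = f_new
            break
        f = f_new
    return f.reshape(C, n).T
```

At the converged mode the self-consistency condition f-hat = K grad(log p(y|f-hat)) holds, which gives a cheap correctness check for any implementation.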