JuliaRobotics / KernelDensityEstimate.jl

Kernel Density Estimate with product approximation using multiscale Gibbs sampling
GNU Lesser General Public License v2.1

How to evaluate the log probability of fitted model? #27

Closed by juliohm 5 years ago

juliohm commented 6 years ago

I am trying to guess by reading the source code, but I am not 100% sure. Could you please clarify two basic questions? Assume I have fitted a 4D KDE.

  1. How to evaluate the log-probability at a new point?
  2. How to sample from the KDE?

I think the latter is sample(kdeobj, npts), correct? Could you please comment on the output? I remember it returning a tuple; what is the meaning of the second entry?

For the former, I found a set of eval* methods with rather complicated arguments; I would appreciate it if you could clarify those.

Thanks,

dehann commented 6 years ago

Hi @juliohm , there is a cleaner interface for evaluating the probability in the works:

```julia
p = kde!(randn(2,100))
p([0.0;0.0])
```

The existing method uses evalDualTree, as mentioned in #2. The actual work is done by evaluate(...). You are probably looking for evalAvgLogL(...), but a convenient wrapper for it has not been implemented yet. Maybe just follow evalDualTree(...) as an example?

EDIT: evaluations are done using the BallTreeDensity type, but there is a convenience wrapper to convert test points into a BallTreeDensity. The new kde!(pts)(evallocations) approach is meant to make the eval* functions easier to use.
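Putting the pieces of this comment together, a minimal sketch of evaluating a log-probability might look like the following. It assumes the callable kde!(pts)(evallocations) interface shown above, with points stored column-wise (one column per sample); taking the elementwise log of the returned density values is my own addition, not a library call.

```julia
using KernelDensityEstimate

# Fit a 2D KDE: rows are dimensions, columns are samples (2 x 100 here).
p = kde!(randn(2, 100))

# Evaluate the density at a single query point via the callable interface.
density = p([0.0; 0.0])

# Log-probability at that point (elementwise log of the returned values).
logp = log.(density)
```

For a 4D KDE, as in the original question, the query point would simply be a 4-element column vector.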

dehann commented 6 years ago

Regarding sampling, you can just do

```julia
p = kde!(randn(100))
pts = rand(p, 1000)
```

EDIT: randn -> rand
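Annotating the snippet above, a sketch of the full sampling round trip (fit, then draw) might look like this; the assumption, based on the example, is that rand(p, N) returns samples column-wise, matching the column-per-point layout used by kde!:

```julia
using KernelDensityEstimate

# Fit a 1D KDE from 100 draws of a standard normal.
p = kde!(randn(100))

# Draw 1000 new samples from the fitted density,
# returned as a matrix with one column per sample.
pts = rand(p, 1000)
```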

juliohm commented 6 years ago

Thank you @dehann, I will keep it in mind next time I need KDE in my work. This time I had to rely on Python's scikit-learn implementation due to an urgent deadline.