sseemayer / mixem

Pythonic Expectation-Maximization (EM) implementation for fitting mixtures of probability densities
https://mixem.readthedocs.org/en/latest/
26 stars 16 forks

Feature: PDF function #1

Closed ghost closed 5 years ago

ghost commented 7 years ago

Hi! I am trying to get familiar with your library. It's simple but useful!

However, I have a suggestion:

You could make each distribution callable e.g.:

import scipy.stats as stats

class NormalDistribution(object):
    def __init__(self, mu, sigma):
        self.mu, self.sigma = mu, sigma

    def __call__(self, x):
        return stats.norm.pdf(x, self.mu, self.sigma)

This way you could implement a PDF feature, e.g. mixem.model.pdf(x, weights, distributions), which returns the weighted linear combination of all distributions:

def pdf(x, weights, distributions):
    result = 0.0
    for w, d in zip(weights, distributions):
        result += w * d(x)
    # weights sum to 1, so this is already the mixture density
    return result

or something like that.
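As a quick sanity check of that idea, the mixture density can be exercised with plain callables standing in for distribution objects. Note this is a sketch, not mixem's actual API; the component lambdas and parameters below are made up for illustration:

```python
import numpy as np
from scipy import stats

def pdf(x, weights, distributions):
    """Weighted mixture density: sum_k w_k * p_k(x)."""
    result = np.zeros_like(np.asarray(x, dtype=float))
    for w, d in zip(weights, distributions):
        result += w * d(x)
    return result

# Two normal components standing in for callable distribution objects
components = [
    lambda x: stats.norm.pdf(x, loc=0.0, scale=1.0),
    lambda x: stats.norm.pdf(x, loc=3.0, scale=0.5),
]
weights = [0.6, 0.4]

x = np.linspace(-4, 6, 1001)
density = pdf(x, weights, components)

# If the weights sum to 1, the mixture should integrate to ~1
mass = density.sum() * (x[1] - x[0])
print(mass)  # ≈ 1.0
```

Because the weights already sum to one, no extra normalization of the result is needed.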

sseemayer commented 7 years ago

Thanks for the kind words and the nice suggestion!

mixem currently works with log-PDF values, implemented in log_density, since I found that to be more numerically stable. Since using log-PDFs might be counter-intuitive, I opted for 'explicit is better than implicit': the log_density function on a distribution has to be called by name.

I don't see a problem with adding a bit of sugar to get the PDF of a distribution and a mixture model the way that you suggest. You could simply implement __call__ on the Distribution base class, returning np.exp(self.log_density(x)) and then use that for a pdf implementation instead of the current one. Fancy making a PR?
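A minimal sketch of what that sugar could look like, assuming a Distribution base class with a log_density(x) method as described above. The class and function names here are illustrative stand-ins, not mixem's real code; the normal log-density formula is the standard one:

```python
import numpy as np

class Distribution:
    """Illustrative base class mirroring the log_density convention."""
    def log_density(self, x):
        raise NotImplementedError

    def __call__(self, x):
        # PDF as sugar over the numerically stable log-density
        return np.exp(self.log_density(x))

class NormalDistribution(Distribution):
    def __init__(self, mu, sigma):
        self.mu, self.sigma = mu, sigma

    def log_density(self, x):
        x = np.asarray(x, dtype=float)
        return (-0.5 * ((x - self.mu) / self.sigma) ** 2
                - np.log(self.sigma * np.sqrt(2 * np.pi)))

def pdf(x, weights, distributions):
    """Weighted mixture PDF built on the per-distribution __call__ sugar."""
    return sum(w * d(x) for w, d in zip(weights, distributions))

d = NormalDistribution(0.0, 1.0)
print(d(0.0))  # peak of the standard normal, ≈ 0.3989
```

Every concrete distribution then gets a PDF for free from its log_density, and the mixture pdf only has to combine the per-component values.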

sseemayer commented 5 years ago

Closing for inactivity.