Closed: drbenvincent closed this issue 5 years ago.
Progress so far. Note:
- We use a fixed `logk` value; we are not yet drawing it from a distribution.
- Entropy is calculated with the `.entropy()` method on the scipy distribution object (a minimal sketch follows).
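A minimal sketch of how this can look with scipy.stats; the sample values and variable names here are illustrative, not the project's actual code:

```python
# Minimal sketch (illustrative, not the repo's code): fit a Gaussian to
# posterior samples of a parameter, then read off its entropy.
import numpy as np
from scipy import stats

# hypothetical posterior samples for logk
posterior_samples = np.random.default_rng(0).normal(loc=-4.5, scale=0.5, size=5000)

# fit a normal distribution to the samples
mu, sigma = stats.norm.fit(posterior_samples)

# differential entropy of the fitted distribution (in nats)
H = stats.norm(loc=mu, scale=sigma).entropy()
print(f"Fitted N({mu:.2f}, {sigma:.2f}); entropy = {H:.3f} nats")
```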
This is approaching completion. I now use seaborn to do the plotting. It is nice. But I need to work out exactly what the error bars represent - it's slightly ambiguous from the documentation, but it looks like a 95% CI on bootstrap estimates of the mean. That is, it does not represent the actual spread of the data. This is fine, but I want to know for sure. I have opened an issue here: https://github.com/mwaskom/seaborn/issues/1619
Also, I still need to decide between plotting the posterior entropy of a normal distribution fitted to the posterior, or computing it from a discrete probability density of the raw posterior samples. The latter is what I did in the original Matlab version. A sketch contrasting the two options follows.
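Roughly, the two options compare like this (an assumed sketch, not the repo's code; note the two numbers are not directly comparable, since the histogram estimate is a discrete entropy that depends on the bin count):

```python
# Sketch of the two candidate entropy measures (assumed, illustrative code)
import numpy as np
from scipy import stats

samples = np.random.default_rng(1).normal(loc=-4.5, scale=0.5, size=5000)

# Option 1: differential entropy of a normal fitted to the samples
mu, sigma = stats.norm.fit(samples)
h_gaussian = stats.norm(loc=mu, scale=sigma).entropy()

# Option 2: discrete entropy of a normalised histogram of the raw samples
# (this is bin-dependent and not on the same scale as Option 1)
counts, _ = np.histogram(samples, bins=50)
h_discrete = stats.entropy(counts)  # scipy normalises the counts to sum to 1

print(f"Gaussian fit: {h_gaussian:.3f} nats; histogram: {h_discrete:.3f} nats")
```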
Now the plot conveys the spread over the various simulations. Shaded zones correspond to sd.
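For reference, a hypothetical seaborn sketch of that kind of plot, with made-up data; `errorbar="sd"` is the current seaborn spelling (older versions used `ci="sd"`):

```python
# Hypothetical sketch: mean entropy per trial across simulation runs,
# with a shaded band of +/- 1 standard deviation.
import numpy as np
import pandas as pd
import seaborn as sns
import matplotlib.pyplot as plt

rng = np.random.default_rng(2)
records = [
    {"simulation": s, "trial": t, "entropy": 2.0 * np.exp(-t / 10) + rng.normal(0, 0.1)}
    for s in range(20)
    for t in range(30)
]
df = pd.DataFrame(records)

# errorbar="sd" shades the standard deviation rather than a bootstrap CI
sns.lineplot(data=df, x="trial", y="entropy", errorbar="sd")
plt.show()
```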
Goal
Be able to generate figures like this
Approach
It's probably easiest/best to calculate entropy by:

1. Using the `fit` method on the appropriate scipy.stats distribution object.
2. Calculating the marginal entropy for that parameter using the fitted parameters from the previous step.
3. Drawing `logk` from the prior for each repetition.

NOTE: This implementation is done assuming the univariate distributions are Gaussian distributed. A sketch of this loop is given below.
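A rough illustration of that loop, with everything assumed (the prior over `logk`, the stubbed inference step, and the array sizes are all hypothetical):

```python
# Hypothetical sketch of the approach: per repetition, draw logk from the
# prior, then track the entropy of a Gaussian fitted to the posterior
# samples after each trial. The "inference" here is a stand-in stub.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
prior = stats.norm(loc=-4.5, scale=1.0)  # assumed prior over logk

n_repetitions, n_trials = 20, 30
entropies = np.empty((n_repetitions, n_trials))

for rep in range(n_repetitions):
    true_logk = prior.rvs(random_state=rng)  # draw logk from the prior
    for trial in range(n_trials):
        # stand-in for real inference: posterior samples that narrow
        # around true_logk as trials accumulate
        width = 1.0 / np.sqrt(trial + 1)
        samples = rng.normal(true_logk, width, size=2000)
        mu, sigma = stats.norm.fit(samples)  # Gaussian assumption (see NOTE)
        entropies[rep, trial] = stats.norm(loc=mu, scale=sigma).entropy()
```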