ModellingWebLab / project_issues

An issues-only repository for issues that cut across multiple repositories

Repeat CMA-ES for cell 5, ensure we get the same result #37

Closed MichaelClerx closed 6 years ago

MichaelClerx commented 6 years ago

With 25 repeats (and the pints default stopping criteria) I get the same result as Kylie:

```
Best 3 scores:
79.9154022423
79.9154028928
79.9154032748
Mean & std of score:
4839.65078784
3108.98116422
Worst score:
7551.94493984
Obtained parameters:
 2.26136868689461094e-04
 6.99135634032594377e-02
 3.44969468557318105e-05
 5.46117622784167422e-02
 8.73227521126475892e-02
 8.92987836542377331e-03
 5.14887953738658085e-03
 3.15622060118181946e-02
 1.52432782134508338e-01
Final score:
79.9154022423
Sigma noise: 0.00462852386082
Log-likelihood: -1.51041369967082329e+06
```

Will need to get some better output from pints. For example, the mean/std I put in above is pretty useless, as it's massively skewed by the handful of runs that get stuck at much worse optima. Would want all scores, the median, etc. Will work on that!
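As a sketch of the kind of summary meant here (plain numpy, not pints output; the example scores below are illustrative, loosely based on the best/worst values above), the median is far more informative than mean/std when a few repeats diverge:

```python
import numpy as np

def summarise_scores(scores, n_best=3):
    """Summarise scores from repeated optimisation runs.

    The median is robust to a few runs that get stuck at much
    worse optima, unlike the mean/std.
    """
    s = np.sort(np.asarray(scores, dtype=float))
    return {
        'best': s[:n_best].tolist(),
        'median': float(np.median(s)),
        'mean': float(np.mean(s)),
        'std': float(np.std(s)),
        'worst': float(s[-1]),
    }

# Example: most repeats find ~79.9, two diverge badly
scores = [79.92, 79.92, 79.92, 79.93, 80.1, 6200.0, 7551.9]
summary = summarise_scores(scores)
print(summary)
```

Here the mean/std are dominated by the two bad runs, while best scores and median tell the real story.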

Comparing with Kylie's published (MCMC) result:

```
Log-likelihood:
K: -1.51063175578740e+006
M: -1.51041369967082329e+06

Parameters:
K: 2.26026076650526e-04
M: 2.26136868689461094e-04
K: 6.99168845608636e-02
M: 6.99135634032594377e-02
K: 3.44809941106440e-05
M: 3.44969468557318105e-05
K: 5.46144197845311e-02
M: 5.46117622784167422e-02
K: 8.73240559379590e-02
M: 8.73227521126475892e-02
K: 8.91302005497140e-03
M: 8.92987836542377331e-03
K: 5.15112582976275e-03
M: 5.14887953738658085e-03
K: 3.15833911359110e-02
M: 3.15622060118181946e-02
K: 1.52395993652348e-01
M: 1.52432782134508338e-01
```

So the first two significant digits are equal in all cases, typically the first three! Formalise this, get better output, and move on?
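Formalising the digit-by-digit check could look something like this (a sketch; `matching_sig_digits` is a hypothetical helper, the values are the K/M pairs from the comparison above):

```python
def matching_sig_digits(a, b):
    """Count how many leading significant digits two values share.

    Compares digit strings in scientific notation; returns 0 if
    the signs or decimal exponents differ.
    """
    sa, sb = f'{a:.14e}', f'{b:.14e}'
    (ma, ea), (mb, eb) = sa.split('e'), sb.split('e')
    if ea != eb or (ma.startswith('-') != mb.startswith('-')):
        return 0
    da = ma.lstrip('-').replace('.', '')
    db = mb.lstrip('-').replace('.', '')
    n = 0
    for x, y in zip(da, db):
        if x != y:
            break
        n += 1
    return n

# Kylie's (K) and Michael's (M) parameters from the comparison above
pairs = [
    (2.26026076650526e-04, 2.26136868689461094e-04),
    (6.99168845608636e-02, 6.99135634032594377e-02),
    (3.44809941106440e-05, 3.44969468557318105e-05),
    (5.46144197845311e-02, 5.46117622784167422e-02),
    (8.73240559379590e-02, 8.73227521126475892e-02),
    (8.91302005497140e-03, 8.92987836542377331e-03),
    (5.15112582976275e-03, 5.14887953738658085e-03),
    (3.15833911359110e-02, 3.15622060118181946e-02),
    (1.52395993652348e-01, 1.52432782134508338e-01),
]
agreement = [matching_sig_digits(k, m) for k, m in pairs]
print(agreement)
```

Every pair agrees to at least two significant digits, and most to three or four.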

MichaelClerx commented 6 years ago

This was done with a sum-of-squares error by the way, which (assuming i.i.d. Gaussian noise with known sigma) is proportional to the negative log-likelihood up to an additive constant, so minimising one maximises the other.
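That relation can be checked numerically on synthetic data (a toy sketch: the "model" here is just a constant prediction, purely for illustration):

```python
import numpy as np

# For i.i.d. Gaussian noise with known sigma:
#   log L = -(n/2) * log(2*pi*sigma**2) - SSE / (2*sigma**2)
# i.e. an affine, decreasing function of the sum-of-squares error,
# so ranking candidates by SSE (ascending) is the same as ranking
# them by log-likelihood (descending).

rng = np.random.default_rng(42)
n, sigma = 200, 0.1
data = rng.normal(1.0, sigma, n)

def sse(pred):
    return float(np.sum((data - pred) ** 2))

def log_likelihood(pred):
    return -0.5 * n * np.log(2 * np.pi * sigma**2) - sse(pred) / (2 * sigma**2)

preds = np.linspace(0.5, 1.5, 11)
order_by_sse = np.argsort([sse(p) for p in preds])
order_by_nll = np.argsort([-log_likelihood(p) for p in preds])
orders_match = np.array_equal(order_by_sse, order_by_nll)
print(orders_match)
```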

mirams commented 6 years ago

I can't remember whether we let CMA-ES loose on the log posterior rather than the log likelihood. In terms of getting to the maximum posterior density point, rather than the maximum likelihood point, this is probably what we should do.
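The distinction only matters when the prior is informative: the log-posterior is the log-likelihood plus the log-prior (up to a constant), so with a uniform prior over the search box the maximiser is the same either way. A toy 1-d illustration (not the actual hERG model; all functions here are hypothetical):

```python
import numpy as np

def log_likelihood(theta):
    return -0.5 * (theta - 2.0) ** 2          # peaks at theta = 2

def log_prior_uniform(theta):
    return 0.0 if 0.0 <= theta <= 10.0 else -np.inf

def log_prior_normal(theta):
    return -0.5 * theta ** 2                  # pulls estimate towards 0

# Grid search as a stand-in for CMA-ES
thetas = np.linspace(0.0, 10.0, 10001)
ml = thetas[np.argmax([log_likelihood(t) for t in thetas])]
map_uniform = thetas[np.argmax(
    [log_likelihood(t) + log_prior_uniform(t) for t in thetas])]
map_normal = thetas[np.argmax(
    [log_likelihood(t) + log_prior_normal(t) for t in thetas])]
print(ml, map_uniform, map_normal)
```

With the uniform prior the maximum-posterior point equals the maximum-likelihood point (2.0); the informative Gaussian prior pulls it to 1.0.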

MichaelClerx commented 6 years ago

I don't see a posterior even in the mcmc code!

On Fri, 08 Dec 2017 11:55:39 +0000 (UTC) Gary Mirams notifications@github.com wrote:

I can't remember whether we let CMA-ES loose on the log posterior rather than the log likelihood.


MichaelClerx commented 6 years ago

Works now!