Closed JelleAalbers closed 6 years ago
Checked with a 3D likelihood using binned histograms -- it ran as expected and decreased memory use.
`best_anchor` checked: it runs and returns sensible values, but no detailed checks were performed.
Caching in the source avoids two problems my solution had:
This lets Source keep a dictionary cache of loaded source data (like PDFs), which reduces the memory footprint if the same source is loaded many times (for example, in models with many nuisance parameters that don't affect all sources). Thanks to @kdund for spotting and correctly diagnosing the problem.
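A minimal sketch of this kind of per-source cache, assuming a hypothetical `Source` class whose loaded data is fully determined by its config (the names `_data_cache`, `_cache_key`, and `_compute_data` are illustrative, not the package's actual API):

```python
class Source:
    """Sketch of caching loaded source data (e.g. PDFs) in a class-level dict."""

    # Shared across all instances: maps a config hash to loaded data, so
    # constructing the same source again becomes a dictionary lookup.
    _data_cache = {}

    def __init__(self, config):
        self.config = config

    def _cache_key(self):
        # Hash only the settings that determine the stored data.
        return hash(frozenset(self.config.items()))

    def load_data(self):
        key = self._cache_key()
        if key not in Source._data_cache:
            Source._data_cache[key] = self._compute_data()
        return Source._data_cache[key]

    def _compute_data(self):
        # Stand-in for an expensive PDF computation or file load.
        return {'pdf': sum(self.config.values())}
```

With this, two sources built from identical configs share one data object instead of each holding their own copy, which is where the memory saving comes from.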
I also included two bonus features I've been meaning to commit for a while:
A `best_anchor` inference method, which finds the shape-parameter anchor combination that best matches the data. Useful as a starting guess for further fitting.
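A sketch of what a `best_anchor`-style search could look like: exhaustively score every anchor combination and keep the best one. The function signature and the idea of passing a log-likelihood callable are assumptions for illustration, not the method's actual interface:

```python
import itertools


def best_anchor(anchors, loglikelihood):
    """Return the anchor combination with the highest log-likelihood.

    anchors: dict mapping each shape-parameter name to its list of
        anchor values (hypothetical input format).
    loglikelihood: callable scoring one {name: value} combination
        against the data.
    """
    names = sorted(anchors)
    best_combo, best_ll = None, float('-inf')
    # Grid search over the full cartesian product of anchor values.
    for values in itertools.product(*(anchors[n] for n in names)):
        combo = dict(zip(names, values))
        ll = loglikelihood(combo)
        if ll > best_ll:
            best_combo, best_ll = combo, ll
    return best_combo, best_ll
```

Since anchors form a finite grid, this is cheap compared to a continuous fit, and the winning combination makes a sensible starting point for a minimizer.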