andreashlarsen / BayesFit

Fitting of SAXS or SANS data with Bayesian inference
GNU General Public License v3.0

automatic differentiation #3

Open andreashlarsen opened 1 day ago

andreashlarsen commented 1 day ago

I thank reviewer 3 for raising this issue (rephrased by AHL):

The normalised curvature matrix [used to find the optimal value of alpha] is constructed numerically from chi2 using a forward-difference scheme (numerical differentiation). When working out the second derivative of chi2 by numerical differentiation, the process can be sensitive to various numerical instabilities.

I would strongly suggest that you look into automatic differentiation (e.g. jax, https://jax.readthedocs.io/en/latest/jax-101/04-advanced-autodiff.html, https://jax.readthedocs.io/en/latest/notebooks/autodiff_cookbook.html).
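
For reference, a minimal sketch of what this could look like: chi2 written in jax.numpy, with the curvature matrix obtained from `jax.hessian` instead of a forward-difference scheme. The two-parameter Guinier-style model and all parameter names below are illustrative assumptions, not BayesFit's actual model.

```python
# Sketch: curvature matrix of chi2 via JAX automatic differentiation.
# The Guinier-style model here is purely illustrative.
import jax
import jax.numpy as jnp

jax.config.update("jax_enable_x64", True)  # double precision for stable Hessians

def model(params, q):
    I0, Rg = params  # hypothetical parameters: forward scattering and radius of gyration
    return I0 * jnp.exp(-(q * Rg) ** 2 / 3.0)

def chi2(params, q, I_exp, sigma):
    # chi-square of the model against measured intensities
    return jnp.sum(((model(params, q) - I_exp) / sigma) ** 2)

# Second derivatives of chi2 w.r.t. params, exact to machine precision
curvature = jax.hessian(chi2)

q = jnp.linspace(0.01, 0.3, 50)
I_exp = model(jnp.array([1.0, 30.0]), q)   # synthetic "data" for the demo
sigma = 0.01 * I_exp
print(curvature(jnp.array([0.9, 28.0]), q, I_exp, sigma))  # 2x2 curvature matrix
```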

andreashlarsen commented 1 day ago

An addition to the above issue, also from reviewer 3:

I would then suggest writing a function for the log-prior and a function for the log-likelihood (i.e. chi2), then combining them into a log-posterior function. If you write them in the correct manner you can use these straightforwardly for automatic differentiation. This has several knock-on effects:

a. you can work out these curvature matrices easily;
b. it is then straightforward to use most Bayesian analysis packages (pymc/emcee/dynesty);
c. automatic differentiation is very powerful when performing MCMC sampling using e.g. pymc, which has an MCMC sampler called NUTS to derive posterior probability distributions; this is greatly sped up if you have an easy way to calculate gradients/Hessians.

A sketch of that structure follows below.
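
A minimal sketch of the suggested structure, assuming a Gaussian prior and the same illustrative Guinier-style model as above (all names here are hypothetical, not BayesFit's current code):

```python
# Sketch: separate log-prior and log-likelihood, combined into a
# log-posterior that JAX can differentiate directly. The Gaussian
# prior and the model are illustrative assumptions.
import jax
import jax.numpy as jnp

jax.config.update("jax_enable_x64", True)

def log_prior(params, mu, tau):
    # Gaussian prior centred on mu with widths tau (assumed form)
    return -0.5 * jnp.sum(((params - mu) / tau) ** 2)

def log_likelihood(params, q, I_exp, sigma):
    # Gaussian likelihood, i.e. -chi2/2 up to an additive constant
    I0, Rg = params
    I_model = I0 * jnp.exp(-(q * Rg) ** 2 / 3.0)
    return -0.5 * jnp.sum(((I_model - I_exp) / sigma) ** 2)

def log_posterior(params, mu, tau, q, I_exp, sigma):
    return log_prior(params, mu, tau) + log_likelihood(params, q, I_exp, sigma)

# (a) gradients and curvature matrices become one-liners:
grad_logpost = jax.grad(log_posterior)
hess_logpost = jax.hessian(log_posterior)
```

The same `log_posterior` callable can then be handed to samplers, e.g. as the `log_prob_fn` of emcee's `EnsembleSampler`, or wrapped for NUTS-based samplers that exploit the gradients.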