Open amifalk opened 7 months ago
Very cool! Thanks for sharing.
@amifalk Do you have some idea of the speedup ?
It depends on how many chains you run, whether you have a GPU, how much native Python code is in your model, etc., but it can often be a few orders of magnitude faster.
Greetings!
I've ported a subset of emcee functionality to the NumPyro project under the sampler name AIES.
(For the uninitiated, NumPyro uses JAX as its backend, a library with a NumPy-like interface and additional features like JIT compilation and GPU support. The upshot is that if you're currently using emcee, switching to NumPyro may give you a dramatic inference speedup!)
I've tried my best to match the existing API. You can either use the NumPyro model specification language or provide your own potential function.
Hope this is helpful to some folks!