Closed · renecotyfanboy closed this issue 4 months ago
I was wondering what would be your take on implementing features similar to HMCGibbs for non-HMC samplers, such as the brand-new ensemble sampler or the previous SA? Also, what would be the approach for doing so with variational inference?
while numpyro has a pretty sizable set of mcmc kernels at this point, we have generally been more focused on variational inference so we don't have a particularly flexible mcmc kernel api. the right approach would probably be to

- create such an api
- re-implement things like HMCGibbs with that API
- possibly expose new things like SAGibbs or otherwise add tutorials etc that make it clear how to create such custom kernels from smaller components

> Also, what would be the approach for doing so with variational inference ?

i don't understand this question
Thank you for your answer, I'll take a look at how feasible this would be on my side.
> Also, what would be the approach for doing so with variational inference ?
This was a more general question; I am not yet familiar with VI in general. For my specific problem, I have a parameter for which I know the posterior distribution, and I used HMCGibbs to sample directly from this distribution. I was wondering how to carry this knowledge over to VI (with HMCGibbs I sample directly from the known distribution instead of fitting it with a guide). It may be a dumb question and I am sorry if that is the case, thank you anyway.
well if your goal is to "fix" a distribution in VI you could just set the approximate posterior distribution in the guide to that distribution (i.e. not learn it) and you'd be done