Hi Michael,
thank you for your interest in our work. I will be happy to read the pre-print when it is available, as this will be one of the few showcases for simulation-based joint modeling!
We have been playing with the idea of estimating hierarchical models for a while and actually want to enable this in the near future in a dedicated branch of BayesFlow. For that, we have started with the "simpler" problem of hierarchical model comparison, for which we will soon have a pre-print and open-source code.
As for the multilevel estimation you describe, there is still work to be done, but we are positive that we will achieve this in the upcoming months. One challenge here is that multilevel neural estimation requires hacky transformations of the neural architectures in order to properly account for all dependencies implied by the underlying graphical model. Stay tuned!
Best, Stefan
Here is a short conference talk on our integrated single-trial joint modeling work using BayesFlow (mentioned about midway through the video): https://www.youtube.com/watch?v=WB7mIyc_1dw
The preprint is almost done!
Really cool stuff! Looking forward to reading the pre-print!
Here is the preprint! I definitely look forward to the future of this research with hierarchical models.
Hi, I was wondering if there is any news about this?
Stay tuned! The hierarchical branch currently features everything needed for the model family Michael described. We are currently doing some systematic benchmarking against Stan before the code becomes ready for "production" and expect to have a pre-print and some tutorials quite soon.
Our paper on this topic can be found at https://arxiv.org/abs/2408.13230. The new BayesFlow version will also feature an improved version of the TwoLevelApproximator, which is currently implemented only in the old backend.
Hi Stefan et al.,
Thank you for your research and building this fantastic package. Amin Ghaderi and I are currently building and testing joint models of EEG and behavioural data for each experimental trial. This necessitates approximating posteriors from joint likelihoods on the single-trial level, so we are using BayesFlow. We will likely have a preprint ready within the next couple of weeks.
We are thinking about the future of using BayesFlow for joint modeling research (e.g. of neural and behavioural data). We often use hierarchical models, which we have previously implemented using JAGS, Stan, etc. (https://github.com/mdnunez/pyhddmjags).
Does BayesFlow currently have the ability to fit hierarchical models? And have you thought about this line of research? I suspect it would necessitate some chain of invertible blocks.
A simple hierarchical extension of the existing DDM example would be a model with many experimental conditions c (e.g. 10), in which there is also a hierarchical drift-rate parameter, the mean of the condition drift rates:
Likelihood: choice-RT ~ DDM(drift[c], ...)
Prior: drift[c] ~ Normal(mu_drift, std_drift^2)
Hyperprior: mu_drift ~ Uniform(-7, 7), etc.
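To illustrate the generative structure, here is a minimal plain-NumPy sketch (not BayesFlow code); the Euler-Maruyama step size, boundary, non-decision time, the std_drift hyperprior, and the function names are all placeholder assumptions:

```python
import numpy as np

rng = np.random.default_rng(2023)

def simulate_ddm_trial(drift, boundary=1.5, ndt=0.3, dt=0.001, max_t=5.0):
    """Euler-Maruyama simulation of a single diffusion-model trial.
    Returns (reaction time, choice in {0, 1}). All settings are illustrative."""
    x, t = 0.0, 0.0
    while abs(x) < boundary / 2 and t < max_t:
        x += drift * dt + np.sqrt(dt) * rng.normal()
        t += dt
    choice = int(x > 0)
    return ndt + t, choice

def simulate_hierarchical_ddm(n_conditions=10, n_trials_per_condition=100):
    """Draw hyperparameters, condition-level drifts, and single-trial choice-RT data."""
    # Hyperprior
    mu_drift = rng.uniform(-7, 7)
    std_drift = rng.uniform(0.1, 2.0)  # assumed hyperprior on the group SD
    # Condition-level prior
    drift = rng.normal(mu_drift, std_drift, size=n_conditions)
    # Likelihood: simulate trials per condition
    data = []
    for c in range(n_conditions):
        for _ in range(n_trials_per_condition):
            rt, choice = simulate_ddm_trial(drift[c])
            data.append((c, rt, choice))
    return {"mu_drift": mu_drift, "std_drift": std_drift,
            "drift": drift, "data": np.array(data)}

sim = simulate_hierarchical_ddm()
print(sim["mu_drift"], sim["drift"][:3], sim["data"].shape)
```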
This isn't as simple as including an extra normal distribution in the simulator, because we want to estimate marginal posterior distributions for the 10 condition drifts drift[c], as well as marginal posterior distributions for the hierarchical parameters mu_drift and std_drift. I also believe there is a fundamental difference between the model above and this model:

Likelihood: choice-RT ~ Complicated_likelihood(mu_drift, std_drift^2, ...)
Prior: mu_drift ~ Uniform(-7, 7), etc.
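To make that difference concrete, and assuming "Complicated_likelihood" refers to the condition drifts being integrated out, the second model's likelihood would be the marginal

$$
p(\text{choice-RT} \mid \mu_\text{drift}, \sigma_\text{drift}) = \int \prod_{c=1}^{10} p(\text{choice-RT}_c \mid \text{drift}_c)\,\mathcal{N}\big(\text{drift}_c \mid \mu_\text{drift}, \sigma_\text{drift}^2\big)\, d\,\text{drift}_{1:10},
$$

so only mu_drift and std_drift remain as parameters, whereas the hierarchical model above keeps every drift[c] as an explicit parameter with its own marginal posterior.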
Kind regards,
Michael