stefanradev93 / BayesFlow

A Python library for amortized Bayesian workflows using generative neural networks.
https://bayesflow.org/
MIT License

Has this situation come across to you? #113

Closed Shuwan-Wang closed 9 months ago

Shuwan-Wang commented 9 months ago

Hi Stefan et al.,

I came across a situation where, no matter how many ACBs I stack (I have tried up to 13), the transformed latent variables z do not perfectly follow N(0, 1), as in the examples shown in the paper and in your example code.

Has this happened to you? Do you have any suggestions? (If the transformed latent variables are not trained to be perfectly N(0, 1), then the posterior draws from the inference phase wouldn't be valid, right?)

Thank you so much! Best, Shuwan

stefanradev93 commented 9 months ago

Shuwan,

the latent samples will seldom follow a spherical Gaussian perfectly. Since all neural inference is approximate, the question is not whether the approximation is imperfect, but rather: "How bad is the approximation?" You can answer this by looking at the various metrics for parameter recoverability or model re-simulations as a whole.
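As a minimal sketch of quantifying "how bad" the latent approximation is, you can compare the empirical moments of the transformed latent samples against those of a standard Gaussian. The array `z` below is a stand-in for the latents your network produces (here drawn from an actual standard normal purely for illustration); in practice you would substitute the output of your trained network on held-out simulations.

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in for transformed latent samples z = f(theta; x).
# Replace with your network's latent outputs on validation data.
z = rng.standard_normal((5000, 4))

# Deviation of the empirical mean from the zero vector.
mean_err = np.linalg.norm(z.mean(axis=0))

# Frobenius deviation of the empirical covariance from the identity.
cov_err = np.linalg.norm(np.cov(z, rowvar=False) - np.eye(z.shape[1]))

print(f"mean deviation: {mean_err:.3f}")
print(f"covariance deviation: {cov_err:.3f}")
```

Small deviations of this magnitude are expected even for perfectly trained networks, due to finite-sample noise; large, persistent deviations are the signal that training or architecture needs attention. Moment checks like this complement, but do not replace, recovery and simulation-based calibration diagnostics.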

Also, increasing the number of coupling layers is only one of multiple ways to improve performance. If you find that it doesn't make much of a difference, then you should try other training hyperparameters, such as the learning rate and the number of training samples. If you have a summary network, the reason for poor performance may also reside in its architecture, so you may want to try improving the summary network as well. In addition, you can try switching to spline flows (coupling_design='spline') and using significantly fewer coupling layers.
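A configuration sketch of the spline suggestion, assuming the BayesFlow v1 interface (the exact constructor arguments may differ across versions, so treat the names here as illustrative rather than definitive):

```python
import bayesflow as bf

# Spline couplings are more expressive per layer than affine ones,
# so a handful of layers often suffices where many affine ACBs did not.
inference_net = bf.networks.InvertibleNetwork(
    num_params=5,                  # dimensionality of your parameter vector
    num_coupling_layers=4,         # fewer layers than with affine couplings
    coupling_design="spline",      # switch from the default affine design
)
```

The trade-off is that spline layers are somewhat more expensive to evaluate, so it is worth comparing both designs under the same training budget.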

marvinschmitt commented 9 months ago

Hi Shuwan,

on a general note: We have just launched the BayesFlow Forums, a dedicated place to ask for support and tips. Feel free to ask any future workflow questions right over there:

https://discuss.bayesflow.org/

Thanks, Marvin