krasserm / bayesian-machine-learning

Notebooks about Bayesian methods for machine learning
Apache License 2.0
1.81k stars · 460 forks

Variables t_mean and t_log_var are globals #1

Closed civilinformer closed 5 years ago

civilinformer commented 5 years ago

I was trying to reproduce the variational autoencoder notebook (the one that uses predictors to structure the latent space) in a systematic way by moving all of the model construction into function calls. Unfortunately, doing that hides the important globals t_mean and t_log_var, perhaps the two most important variables in the entire scheme. Even the loss functions access these variables as implicit globals. While this seems to work, it's highly unsatisfactory when one is trying to build clean classes. How can we do this in a more functional manner within the Keras framework? Could auxiliary variables be passed as function arguments instead? I don't even know how the global variables get updated as training progresses. What guarantees are there that these variables hold what we think they hold, or that the loss functions are accessing them properly?

krasserm commented 5 years ago

If you want to follow an FP approach, pass these variables as function arguments and/or return values as needed. They are part of the computational graph and are updated accordingly during training. If you would rather follow an OO approach, make them members of a class that encapsulates the architecture and the losses. Feel free to open a PR; that would make it easier to discuss.
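A minimal sketch of the OO suggestion, with hypothetical names (the `VAE` class, `encode`, and `kl_loss` are illustrative and not taken from the notebook; a toy NumPy "encoder" stands in for the real Keras layers). The point is the pattern: `t_mean` and `t_log_var` are produced by a method and returned as values, so the loss receives them explicitly rather than reading module-level globals.

```python
import numpy as np

class VAE:
    """Encapsulates the encoder outputs and the losses that depend on them."""

    def __init__(self, input_dim=4, latent_dim=2, seed=0):
        rng = np.random.default_rng(seed)
        self.latent_dim = latent_dim
        # Toy linear "encoder" weights; the real model would use Keras layers.
        self.w_mean = rng.normal(size=(input_dim, latent_dim))
        self.w_log_var = rng.normal(size=(input_dim, latent_dim))

    def encode(self, x):
        # Returns (t_mean, t_log_var) as values instead of writing globals.
        return x @ self.w_mean, x @ self.w_log_var

    def kl_loss(self, x):
        # The loss obtains the variables through self.encode, not via globals,
        # so the data flow is explicit and testable.
        t_mean, t_log_var = self.encode(x)
        return -0.5 * np.sum(
            1 + t_log_var - np.square(t_mean) - np.exp(t_log_var), axis=-1
        )

vae = VAE()
loss = vae.kl_loss(np.ones((3, 4)))  # one KL value per sample, shape (3,)
```

The same shape works in Keras proper: make the encoder a `Model` whose outputs are the mean and log-variance tensors, and build the loss from those returned tensors (or attach it with `add_loss`) instead of closing over globals.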