Closed akucukelbir closed 7 years ago
Before deciding, perhaps it's useful to think about where MFVI sits among all current and future implemented algorithms. Of course, we don't have the foresight to know all algorithms, or even the best conceptual foundation for organizing them. But it could be a useful thought exercise.
Here is my attempt at a flattened list below (my cognitive biases already led me to impose some structure as I wrote it). Not all of these apply to the same class of probability models.
- inference methods
- inference-independent features
- variational models
This is cool. I can kind of already see the skeleton of additional scaffolding. What am I forgetting?
Some ideas here for modular pieces/transforms/concepts that span stats/ML/DL: https://github.com/JuliaML/Roadmap.jl/issues/8
Looks useful! Thanks for the link. Certainly interesting to see how others aim to organize all learning concepts.
I think we're able to support most of that roadmap (or TensorFlow as the base already does), using the core abstractions in Box's loop: `model`, `inference`, and `criticism`. I think using `Transformation` as the primary abstraction is misguided. It's a very supervised way of thinking about the world. But it could just be because I strongly believe BDA is the best modus operandi.
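To make the three abstractions concrete, here is a toy sketch of one pass through Box's loop in plain NumPy. This is illustrative only; `model`, `inference`, and `criticism` here are hypothetical stand-ins for the structure, not Edward's actual API:

```python
import numpy as np

rng = np.random.default_rng(0)

# model: a generative story -- data drawn from N(theta, 1)
def model(theta, n, rng):
    return theta + rng.standard_normal(n)

# inference: fit the posterior for theta given data
# (here the exact conjugate posterior under a N(0, 10^2) prior)
def inference(data, prior_var=100.0):
    n = len(data)
    post_var = 1.0 / (n + 1.0 / prior_var)
    post_mean = post_var * data.sum()
    return post_mean, np.sqrt(post_var)

# criticism: posterior predictive check comparing a statistic of
# replicated datasets against the observed data (a Bayesian p-value)
def criticism(data, post_mean, post_sd, rng, reps=200):
    stats = []
    for _ in range(reps):
        theta = rng.normal(post_mean, post_sd)
        stats.append(model(theta, len(data), rng).mean())
    return np.mean(np.array(stats) >= data.mean())

data = model(2.0, n=50, rng=rng)
post_mean, post_sd = inference(data)
pval = criticism(data, post_mean, post_sd, rng)
print(post_mean, pval)
```

The point of the structure is that `inference` and `criticism` are swappable independently of the `model`, which is roughly what makes a `Transformation`-first design feel too narrow.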
I agree, but it seems to me that `Transformation` is at a lower level of abstraction than `model`, `inference`, and `criticism`, so they aren't mutually exclusive... though I could be wrong.
I'm also partial to BDA :)
Can Edward (or TensorFlow) differentiate a function written in NumPy with loops?
renamed in #344
MFVI is currently a catch-all class: it captures BBVI, ADVI (not implemented yet), and various extensions.

Perhaps it makes sense to rename it `KLqp`? Or perhaps it makes sense to rename both KL-based methods as `KLforward` and `KLreverse`?

What do you guys think?
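For context on the naming question: the two directions of the KL divergence genuinely differ, which is the motivation for distinguishing them in class names. A quick NumPy check using the closed-form KL between univariate Gaussians (a sketch for illustration, not Edward code):

```python
import numpy as np

def kl_gaussian(m1, s1, m2, s2):
    """Closed-form KL( N(m1, s1^2) || N(m2, s2^2) )."""
    return np.log(s2 / s1) + (s1**2 + (m1 - m2)**2) / (2 * s2**2) - 0.5

# "reverse" KL(q || p) -- the direction MFVI/BBVI/ADVI minimize --
# versus "forward" KL(p || q); the two generally disagree.
kl_qp = kl_gaussian(0.0, 1.0, 1.0, 2.0)  # KL(q || p)
kl_pq = kl_gaussian(1.0, 2.0, 0.0, 1.0)  # KL(p || q)
print(kl_qp, kl_pq)
```

The asymmetry is exactly why a generic name like `MFVI` blurs which objective is being optimized, and why `KLqp` (or `KLreverse`) is more precise.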