innuo opened this issue 4 years ago
Does that setup depend on the generative model at all? If I understand correctly, you are writing a custom loss function and then optimizing that loss function. (And depending on the choice of kernel, that optimization may or may not correspond to MAP estimation in the above generative model.)
@bzinberg You are right. I should have stated my question more clearly, as follows. I am trying to perform inference on a model where the observables are deterministic functions of some latent variables (as in my example above). What would be a recommended inference procedure using the functionality that Gen provides?
I am trying to implement some inference algorithms where my forward model is specified implicitly (i.e., the likelihood is not available). I was wondering what the recommended way would be to either use or extend Gen for this purpose, so that I can still take advantage of the rest of its very nice functionality.
For example, I have a model of the form z ~ Normal(\mu, \Sigma), x = f(z, \theta).
I have observations for x (the x_i), and I would like to estimate \theta by back-propagation and infer the z_i. I plan to optimize a kernel maximum mean discrepancy (MMD) loss between the generated and observed x_i for the estimation.
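For concreteness, here is a minimal sketch of the kind of loss I have in mind. This is not Gen's API; it is a plain NumPy illustration, with an RBF kernel and a placeholder linear map standing in for f(z, \theta) (the bandwidth, the linear f, and all variable names are illustrative assumptions):

```python
import numpy as np

def rbf_kernel(a, b, bandwidth=1.0):
    # Pairwise RBF kernel matrix: k(a_i, b_j) = exp(-||a_i - b_j||^2 / (2 h^2)).
    sq_dists = np.sum((a[:, None, :] - b[None, :, :]) ** 2, axis=-1)
    return np.exp(-sq_dists / (2.0 * bandwidth ** 2))

def mmd2(x_gen, x_obs, bandwidth=1.0):
    # Biased (V-statistic) estimate of squared MMD between two samples;
    # it is exactly zero when the two samples coincide.
    k_xx = rbf_kernel(x_gen, x_gen, bandwidth)
    k_yy = rbf_kernel(x_obs, x_obs, bandwidth)
    k_xy = rbf_kernel(x_gen, x_obs, bandwidth)
    return k_xx.mean() + k_yy.mean() - 2.0 * k_xy.mean()

# Forward model: z ~ Normal(mu, Sigma), x = f(z, theta).
# Here f is a placeholder linear map, purely for illustration.
rng = np.random.default_rng(0)
theta = np.array([[2.0, 0.0], [0.0, 0.5]])
z = rng.multivariate_normal(mean=np.zeros(2), cov=np.eye(2), size=200)
x_gen = z @ theta

# Pretend these are the observed x_i.
x_obs = rng.multivariate_normal(mean=np.zeros(2), cov=np.eye(2), size=200)
print(mmd2(x_gen, x_obs))
```

In the actual setup one would make this loss differentiable (e.g. via an autodiff framework) and run gradient descent on \theta, with the z_i either reparameterized or treated as per-datum latents.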