ae-foster / pyro

Deep universal probabilistic programming with Python and PyTorch
http://pyro.ai

Best practice for MC in pyro #13

Open ae-foster opened 5 years ago

ae-foster commented 5 years ago

I think pyro.contrib.oed has deviated from best practice for doing Monte Carlo estimation in pyro. Let's look at how I currently obtain multiple, independent samples from a model:

First

def lexpand(A, *dimensions):
    """Expand tensor, adding new dimensions on left."""
    return A.expand(tuple(dimensions) + A.shape)
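As a quick torch-only sanity check of the shape arithmetic (the example tensors are my own, not from the repo), `lexpand` prepends the requested sizes to the existing shape:

```python
import torch

def lexpand(A, *dimensions):
    """Expand tensor, adding new dimensions on left (as in the snippet above)."""
    return A.expand(tuple(dimensions) + A.shape)

A = torch.zeros(3, 2)
print(lexpand(A, 4).shape)     # torch.Size([4, 3, 2])
print(lexpand(A, 4, 5).shape)  # torch.Size([4, 5, 3, 2])
```

Note that `expand` returns a broadcasting view rather than copying memory, which is part of why this is cheap.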

Then, in eig.py

from pyro import poutine  # needed by this excerpt

# Take N samples of the model
expanded_design = lexpand(design, N)  # N copies of the model
trace = poutine.trace(model).get_trace(expanded_design)

What's the point of this versus something like EmpiricalMarginal? This approach exploits tensorization: all the simulations run in parallel, which in practice is much faster than running N simulations of the model in series (e.g. by creating N separate traces). Another appealing feature is control over the shape of the output tensor: if I want N×M samples arranged in a grid (say, to sum over one dimension and do something else along the other), I just call lexpand(design, N, M).
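To make the parallelism concrete, here is a minimal torch-only sketch (the Bernoulli "model" and the concrete shapes are my own illustration, not code from eig.py): expanding the design on the left yields an N×M grid of independent simulations from a single batched call.

```python
import torch

# Toy stand-in for a model: one Bernoulli draw per design entry.
design = torch.tensor([0.2, 0.8])              # shape (2,)
N, M = 5, 3

# Equivalent to lexpand(design, N, M): add two new leading dims.
expanded = design.expand(N, M, *design.shape)  # shape (5, 3, 2)

# One batched call simulates the whole N x M grid in parallel,
# instead of N * M serial runs of the model.
samples = torch.bernoulli(expanded)
print(samples.shape)  # torch.Size([5, 3, 2])
```

Summing `samples` over dim 0 then manipulating dim 1 gives exactly the grid-of-samples pattern described above.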

The problem: this is not idiomatic Pyro. I need code inside my models that expands everything to match the dimensions of the design input. Is there a tensorized way to take independent samples of a model?

ae-foster commented 5 years ago

A place to look would be https://github.com/ae-foster/pyro/blob/oed-master/pyro/contrib/oed/eig.py#L703 which is my implementation of the ELBO