bayesiains / nflows

Normalizing flows in PyTorch
MIT License

Rewrote BoxUniform such that it can be used as a base for a flow #17

Closed · rbvh closed this 2 years ago

rbvh commented 4 years ago

Hello,

Back again trying to contribute something useful. I found that if I use BoxUniform as the base distribution for a flow model, I get:

log_prob() got an unexpected keyword argument 'context'
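For reference, roughly the smallest reproduction I can think of (the autoregressive transform here is just an arbitrary placeholder):

import torch
from nflows.flows.base import Flow
from nflows.distributions.uniform import BoxUniform
from nflows.transforms.autoregressive import MaskedAffineAutoregressiveTransform

# BoxUniform on the unit square as the base distribution of a small flow.
base_dist = BoxUniform(torch.zeros(2), torch.ones(2))
transform = MaskedAffineAutoregressiveTransform(features=2, hidden_features=4)
flow = Flow(transform, base_dist)

# Flow.log_prob forwards a `context` keyword to the base distribution's
# log_prob; the current BoxUniform does not accept it, hence the error above.
flow.log_prob(torch.rand(16, 2))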

I rewrote BoxUniform to look more like StandardNormal to prevent that issue. I could also rename it if the original BoxUniform needs to stay around.
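The new class basically follows the StandardNormal pattern: implement _log_prob(inputs, context) and _sample(num_samples, context) and just ignore the context. A simplified sketch of that shape (not the exact code in this PR; the method bodies are only illustrative):

import torch
from nflows.distributions.base import Distribution

class BoxUniform(Distribution):
    """Uniform on an axis-aligned box; accepts (and ignores) a context."""

    def __init__(self, low, high):
        super().__init__()
        if low.shape != high.shape:
            raise ValueError("low and high are not of the same size")
        if not (low < high).byte().all():
            raise ValueError("low has elements that are higher than high")
        self._shape = low.shape
        self._low = low
        self._high = high
        self._log_prob_value = -torch.sum(torch.log(high - low))

    def _log_prob(self, inputs, context):
        # Note: the context is ignored.
        inside = ((inputs >= self._low) & (inputs <= self._high)).all(dim=-1)
        log_prob = torch.full(inputs.shape[:1], float("-inf"), device=inputs.device)
        log_prob[inside] = self._log_prob_value
        return log_prob

    def _sample(self, num_samples, context):
        # Note: the context is ignored.
        u = torch.rand(num_samples, *self._shape, device=self._low.device)
        return self._low + u * (self._high - self._low)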

Cheers, Rob

sambklein commented 3 years ago

Could this, or something like it, please be accepted?

MilesCranmer commented 3 years ago

+1

See #33, which is a more general fix for this as well as for other torch.distributions.Independent objects.

sitmo commented 3 years ago

Ah cool, thanks! I was wondering if it was possible to have context-dependent transforms. I thought the only way to provide a context was via the base distribution! I'm going to try that now!

On Thu, Feb 18, 2021, 19:24 Miles Cranmer notifications@github.com wrote:

@MilesCranmer commented on this pull request.

In nflows/distributions/uniform.py https://github.com/bayesiains/nflows/pull/17#discussion_r578649312:

        raise ValueError(
            "low and high are not of the same size"
        )
    if not (low < high).byte().all():
        raise ValueError(
            "low has elements that are higher than high"
        )
    self._shape = low.shape
    self._low = low
    self._high = high
    self._log_prob_value = -torch.sum(torch.log(high - low))

def _log_prob(self, inputs, context):
    # Note: the context is ignored.

I did get the spline code working at some point; I don't remember if I made changes, but I don't think you need to.

Re: the context, it should actually work with a context-less base distribution and context-full transformations! That's what I need this modification for.


MilesCranmer commented 3 years ago

No problem!

The conditional moons example actually uses context for both the base distribution AND the transform:

from torch import nn
from nflows.flows.base import Flow
from nflows.distributions.normal import ConditionalDiagonalNormal
from nflows.transforms.base import CompositeTransform
from nflows.transforms.autoregressive import MaskedAffineAutoregressiveTransform
from nflows.transforms.permutations import ReversePermutation

num_layers = 5  # number of flow steps

base_dist = ConditionalDiagonalNormal(shape=[2],
                                      context_encoder=nn.Linear(1, 4))

transforms = []
for _ in range(num_layers):
    transforms.append(ReversePermutation(features=2))
    transforms.append(MaskedAffineAutoregressiveTransform(features=2,
                                                          hidden_features=4,
                                                          context_features=1))
transform = CompositeTransform(transforms)

flow = Flow(transform, base_dist)

^ See the context_features kwarg to MaskedAffineAutoregressiveTransform? That is the conditioning input for the transform itself, as opposed to the base distribution.
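With that setup, the context is simply passed at call time (continuing from the snippet above; the shapes here are only illustrative):

import torch

y = torch.rand(64, 1)    # context, shape [batch, context_features]
x = torch.randn(64, 2)   # inputs, shape [batch, features]

log_prob = flow.log_prob(inputs=x, context=y)      # shape [64]
samples = flow.sample(num_samples=10, context=y)   # shape [64, 10, 2]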

Cheers, Miles

sitmo commented 3 years ago

Thanks Miles, that's very helpful. I'm starting to better understand the nuts and bolts now!

MilesCranmer commented 3 years ago

Update: I'm not sure when #33 will get merged, but in the meantime you can install the fixed version with:

pip install -U git+https://github.com/MilesCranmer/nflows.git@dev

This should fix these errors. I'll try to keep it up-to-date with the master branch until things get merged.