tonyduan / normalizing-flows

Neural Spline Flow, RealNVP, Autoregressive Flow, 1x1Conv in PyTorch.
MIT License
271 stars 38 forks

Requesting Advice on NF Methods #8

Closed kayuksel closed 3 years ago

kayuksel commented 3 years ago

I am working on a project where I sample a set of n-dimensional points from a Gaussian distribution (with learnt parameters) as follows, and then evaluate those points with a loss function to update the model parameters via gradient descent.

mu, std = self.lin_1(z), self.lin_2(z)
eps = torch.Tensor(*img_shape).normal_()
return self.act((eps.cuda() * std) + mu)

I would like to transform the Gaussian distribution so that I can sample those points from a more complex learnt distribution. In other words, the model needs to learn how best to transform points drawn from the Gaussian.
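To illustrate the setup being asked about, here is a minimal, hedged sketch (all class and variable names are hypothetical, not from this repo): Gaussian samples are produced via the reparameterization trick as in the snippet above, then pushed through a learnable invertible transform, so gradients from the downstream loss reach both the Gaussian parameters and the transform.

```python
import torch
import torch.nn as nn

class LearnedGaussianSampler(nn.Module):
    """Sample eps ~ N(0, I) and apply a learnt shift/scale (reparameterization)."""
    def __init__(self, dim):
        super().__init__()
        self.mu = nn.Parameter(torch.zeros(dim))
        self.log_std = nn.Parameter(torch.zeros(dim))

    def forward(self, n):
        eps = torch.randn(n, self.mu.shape[0])
        return eps * self.log_std.exp() + self.mu

class AffineFlowStep(nn.Module):
    """Elementwise learnable affine map; a stand-in for any invertible flow layer."""
    def __init__(self, dim):
        super().__init__()
        self.log_scale = nn.Parameter(torch.zeros(dim))
        self.shift = nn.Parameter(torch.zeros(dim))

    def forward(self, x):
        return x * self.log_scale.exp() + self.shift

sampler, flow = LearnedGaussianSampler(4), AffineFlowStep(4)
x = flow(sampler(16))       # (16, 4) samples from the transformed distribution
loss = x.pow(2).mean()      # placeholder loss; gradients reach both modules
loss.backward()
```

In practice the `AffineFlowStep` would be replaced by a stack of expressive flow layers (e.g. coupling layers), but the wiring, sample then transform then score, stays the same.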

I would be glad if you could suggest the best normalizing flow method (transform) to employ given the following scalability requirements (whether or not it is available in this repo). Thank you very much in advance for your suggestion.

tonyduan commented 3 years ago

Personally I don't have much experience with scalable normalizing flow methods, so I am perhaps not the best suited to answer your question.

That said, I'd start with something simple like stacking RealNVP layers composed with a base network of your choice. Notably, OpenAI's Glow paper essentially consists of blocks of RealNVP layers (along with ActNorm and invertible 1x1 convolutions).
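A stacked-RealNVP approach along these lines can be sketched as follows. This is a rough, self-contained illustration of an affine coupling layer (in the style of Dinh et al.'s RealNVP), not this repo's actual API; the class name, hidden width, and `flip` flag are all assumptions for the sketch.

```python
import torch
import torch.nn as nn

class RealNVPCoupling(nn.Module):
    """One affine coupling layer, RealNVP-style (illustrative sketch).

    Half of the features pass through unchanged and parameterize a
    scale-and-shift of the other half, so the Jacobian is triangular and
    its log-determinant is simply the sum of the log-scales.
    """
    def __init__(self, dim, hidden=64, flip=False):
        super().__init__()
        assert dim % 2 == 0, "sketch assumes an even feature dimension"
        self.d = dim // 2
        self.flip = flip  # alternate which half is transformed when stacking
        self.net = nn.Sequential(
            nn.Linear(self.d, hidden), nn.ReLU(),
            nn.Linear(hidden, 2 * self.d),
        )

    def forward(self, x):
        x1, x2 = x[:, :self.d], x[:, self.d:]
        if self.flip:
            x1, x2 = x2, x1
        log_s, t = self.net(x1).chunk(2, dim=1)
        log_s = torch.tanh(log_s)          # bound the scales for stability
        y2 = x2 * log_s.exp() + t
        y = torch.cat([y2, x1] if self.flip else [x1, y2], dim=1)
        return y, log_s.sum(dim=1)         # transformed sample, per-example log|det J|

# Stack a few layers, alternating halves so every feature gets transformed:
flows = nn.ModuleList([RealNVPCoupling(4, flip=(i % 2 == 1)) for i in range(4)])
z = torch.randn(8, 4)                      # base Gaussian samples
log_det = torch.zeros(8)
for f in flows:
    z, ld = f(z)
    log_det = log_det + ld                 # accumulates the change-of-variables term
```

The accumulated `log_det` is what lets you evaluate the density of the transformed samples via the change-of-variables formula; for pure sample-quality objectives like the one described in the question, the forward pass alone may suffice.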