Closed kayuksel closed 3 years ago
Hi @kayuksel. I would suggest the Neural Spline Flow line of work for a good trade-off between scalability and expressivity, though I'm not sure whether scaling to 100k dimensions is possible out of the box. This is not really related to torchdyn, but you might want to check out Pyro's implementation. Pyro is a robust and well-tested library for density estimation, generative modeling, and probabilistic programming in general.
I am working on a project where I sample a set of n-dimensional points from a Gaussian distribution (with learnt parameters) as follows, and then evaluate those points with a loss function to update the model parameters via gradient descent.
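The snippet referred to by "as follows" is not shown above; a minimal sketch of such a setup, assuming a diagonal Gaussian with a learnable mean and log-scale and a placeholder loss (both hypothetical, not the asker's actual code), could look like:

```python
import torch

torch.manual_seed(0)
dim = 8

# Learnable parameters of a diagonal Gaussian.
mu = torch.zeros(dim, requires_grad=True)
log_sigma = torch.zeros(dim, requires_grad=True)
opt = torch.optim.Adam([mu, log_sigma], lr=5e-2)

for _ in range(200):
    opt.zero_grad()
    # Reparameterization trick: sampling stays differentiable
    # with respect to mu and log_sigma.
    eps = torch.randn(512, dim)
    points = mu + log_sigma.exp() * eps
    # Placeholder loss on the sampled points; the real project
    # would evaluate its own objective here.
    loss = ((points - 1.0) ** 2).mean()
    loss.backward()
    opt.step()
```

With this placeholder loss, `mu` drifts toward 1 and the scale shrinks, illustrating how gradients flow through the sampled points back into the distribution parameters.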
I would like to transform the Gaussian distribution so that those points can be sampled from a more complex learnt distribution. In other words, the model needs to learn how best to transform points drawn from the Gaussian distribution.
I would be glad if you could suggest the best normalizing-flow method (transform) to employ, given the following scalability requirements (whether or not it is available in this repo). Thank you very much in advance for your suggestion.