netrome / DeepGeneration

Generation of synthetic data sets using generative adversarial neural networks

AEGAN #22

Open netrome opened 6 years ago

netrome commented 6 years ago

A sort of adversarial autoencoder: train a normal autoencoder (with constraints on the latent space to prevent it from drifting) simultaneously with a normal GAN (or perhaps an LSGAN?).
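A minimal sketch of how such a combined training step could look, assuming a PyTorch setup. The tiny MLP modules, the quadratic drift penalty, its 0.1 weight, and the LSGAN-style losses are illustrative choices, not the repository's actual implementation.

```python
import torch
import torch.nn as nn

latent_dim, image_dim = 32, 784

# Placeholder networks; the real models would be convolutional.
enc = nn.Sequential(nn.Linear(image_dim, 128), nn.ReLU(), nn.Linear(128, latent_dim))
dec = nn.Sequential(nn.Linear(latent_dim, 128), nn.ReLU(), nn.Linear(128, image_dim))
disc = nn.Sequential(nn.Linear(image_dim, 128), nn.ReLU(), nn.Linear(128, 1))

opt_ae = torch.optim.Adam(list(enc.parameters()) + list(dec.parameters()), lr=2e-4)
opt_d = torch.optim.Adam(disc.parameters(), lr=2e-4)
mse = nn.MSELoss()

def train_step(real):
    batch = real.size(0)

    # Autoencoder: reconstruction loss plus a penalty keeping latent codes from drifting.
    z = enc(real)
    recon = dec(z)
    constraint = z.pow(2).mean()

    # Generator side of the GAN: decoded prior samples should look real (LSGAN-style).
    z_prior = torch.randn(batch, latent_dim)
    fake = dec(z_prior)
    g_loss = ((disc(fake) - 1) ** 2).mean()

    ae_loss = mse(recon, real) + 0.1 * constraint + g_loss
    opt_ae.zero_grad()
    ae_loss.backward()
    opt_ae.step()

    # Discriminator: separate real images from decoded prior samples.
    d_loss = ((disc(real) - 1) ** 2).mean() + (disc(fake.detach()) ** 2).mean()
    opt_d.zero_grad()
    d_loss.backward()
    opt_d.step()
    return ae_loss.item(), d_loss.item()

ae_loss, d_loss = train_step(torch.rand(16, image_dim))
```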

netrome commented 6 years ago

Working on the AEGAN, I want to test sampling from a uniform distribution and adding a translated, steep ReLU loss on the autoencoder to constrain the latent code to that space. I should try training the autoencoder with this loss first.
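One hedged reading of the "translated strong ReLU" penalty, assuming the target is a uniform box [-1, 1]^d per latent dimension: zero penalty inside the box, a steep linear penalty outside. The bound and slope values are guesses.

```python
import torch
import torch.nn.functional as F

def box_penalty(z, bound=1.0, slope=10.0):
    """Zero inside [-bound, bound] per dimension, steep linear penalty outside."""
    return slope * F.relu(z.abs() - bound).mean()

z = torch.randn(8, 32)   # a batch of latent codes
loss = box_penalty(z)    # added to the reconstruction loss during autoencoder training
```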

netrome commented 6 years ago

Normal distributions are nicer to work with than uniform distributions. Idea: penalize the autoencoder's latent codes with (some normalized) inverse probability density under the prior.
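One illustrative interpretation for a standard-normal prior: normalize the inverse density by its value at the mode, giving a penalty p(0)/p(z) = exp(||z||² / 2) that equals 1 at the origin and grows as codes drift outward (in log-space this reduces to a quadratic penalty). The clamp value below is an arbitrary safeguard.

```python
import torch

def inverse_density_penalty(z, max_val=1e6):
    # Per-sample squared norm of the latent code.
    sq_norm = z.pow(2).sum(dim=1)
    # exp(||z||^2 / 2) explodes quickly, so clamp for numerical stability.
    return torch.exp(0.5 * sq_norm).clamp(max=max_val).mean()

z = torch.randn(8, 32)
print(inverse_density_penalty(z))
```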

netrome commented 6 years ago

Another great thing about this AEGAN: The reconstruction error can be used as a measure of convergence.
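A small sketch of that idea: track a running mean of the per-batch reconstruction MSE and treat a plateau as a convergence signal. The window size and tolerance are placeholder values.

```python
import torch.nn.functional as F

history = []

def reconstruction_converged(real, recon, window=100, tol=1e-4):
    """Log the batch reconstruction error; return True once it has plateaued."""
    history.append(F.mse_loss(recon, real).item())
    if len(history) < 2 * window:
        return False
    recent = sum(history[-window:]) / window
    previous = sum(history[-2 * window:-window]) / window
    return abs(previous - recent) < tol
```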

netrome commented 6 years ago

The convergence time for this algorithm seems high. If the autoencoder fails to encode all relevant information (especially sharp edges), the reconstruction loss and the adversarial (GAN) loss are at risk of counteracting each other.