
Wasserstein Auto-Encoders

This project implements an unsupervised generative modeling technique called Wasserstein Auto-Encoders (WAE), proposed by Tolstikhin, Bousquet, Gelly, and Schoelkopf (2017).
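In brief, a WAE minimizes a reconstruction cost plus a penalty that matches the aggregate posterior Q_Z of the encoder to the prior P_Z; the paper instantiates this penalty either adversarially (WAE-GAN) or with a kernel-based maximum mean discrepancy (WAE-MMD). Schematically, the objective is

$$\min_{Q(Z \mid X)} \; \mathbb{E}_{P_X}\,\mathbb{E}_{Q(Z \mid X)}\big[c(X, G(Z))\big] \;+\; \lambda \, D_Z(Q_Z, P_Z),$$

where c is the reconstruction cost, G is the decoder, and λ weighs the distribution-matching term D_Z.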

Repository structure

wae.py - everything specific to WAE, including the encoder-decoder losses, various forms of distribution-matching penalties, and the training pipelines (a rough sketch of the MMD penalty appears after this list)

run.py - master script to train a specific model on a selected dataset with specified hyperparameters
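For orientation, below is a minimal NumPy sketch of the kind of distribution-matching penalty computed for WAE-MMD: an unbiased MMD² estimate with the inverse multiquadratic kernel suggested in the paper. This is an illustration under those assumptions, not the repository's actual code; the names `imq_kernel` and `mmd_penalty` are hypothetical.

```python
import numpy as np

def imq_kernel(a, b, c):
    """Inverse multiquadratic kernel k(x, y) = c / (c + ||x - y||^2)."""
    sq_dists = np.sum((a[:, None, :] - b[None, :, :]) ** 2, axis=-1)
    return c / (c + sq_dists)

def mmd_penalty(qz, pz, sigma2=1.0):
    """Unbiased MMD^2 estimate between encoded codes qz and prior samples pz.

    qz, pz: arrays of shape (n, d) with the same n.
    sigma2: assumed variance of the Gaussian prior; the paper suggests the
            kernel scale c = 2 * d * sigma2 for an N(0, sigma2 * I) prior.
    """
    n, d = qz.shape
    c = 2.0 * d * sigma2
    k_qq = imq_kernel(qz, qz, c)
    k_pp = imq_kernel(pz, pz, c)
    k_qp = imq_kernel(qz, pz, c)
    # Drop the diagonal terms to get the unbiased U-statistic estimator.
    off_diag = 1.0 / (n * (n - 1))
    mmd = off_diag * (k_qq.sum() - np.trace(k_qq))
    mmd += off_diag * (k_pp.sum() - np.trace(k_pp))
    mmd -= 2.0 / (n * n) * k_qp.sum()
    return mmd

# Toy check: when codes and prior samples come from the same distribution,
# the estimate should be close to zero.
rng = np.random.default_rng(0)
qz = rng.normal(size=(128, 64))  # stand-in for encoder outputs
pz = rng.normal(size=(128, 64))  # samples from the N(0, I) prior
print(mmd_penalty(qz, pz))
```

During training, a term like this would be scaled by λ and added to the reconstruction loss, pushing the distribution of encoded codes toward the prior.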

Example of output pictures

The following picture shows various characteristics of the WAE-MMD model trained on CelebA after 50 epochs:

[Image: training progress of the WAE-MMD model on CelebA]