Wolf is an open-source library for invertible generative flows (normalizing flows).
This is the code we used in the following papers:
Decoupling Global and Local Representations via Invertible Generative Flows
Xuezhe Ma, Xiang Kong, Shanghang Zhang and Eduard Hovy
ICLR 2021
MaCow: Masked Convolutional Generative Flow
Xuezhe Ma, Xiang Kong, Shanghang Zhang and Eduard Hovy
NeurIPS 2019
First go to the experiments directory:
cd experiments
Training a new CIFAR-10 model:
python -u train.py \
--config configs/cifar10/glow-gaussian-uni.json \
--epochs 15000 --valid_epochs 10 \
--batch_size 512 --batch_steps 2 --eval_batch_size 1000 --init_batch_size 2048 \
--lr 0.001 --beta1 0.9 --beta2 0.999 --eps 1e-8 --warmup_steps 50 --weight_decay 1e-6 --grad_clip 0 \
--image_size 32 --n_bits 8 \
--data_path <data path> --model_path <model path>
The hyper-parameters for other datasets are provided in the paper.
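The --batch_steps argument appears to split each batch into smaller segments whose gradients are accumulated before a single optimizer step, trading speed for memory (with --batch_size 512 --batch_steps 2, each forward/backward pass would see 256 examples). Below is a minimal, hypothetical sketch of that gradient-accumulation pattern in PyTorch; it is only an illustration, not the actual loop in train.py.

# Hypothetical sketch of gradient accumulation as presumably controlled by --batch_steps.
import torch
import torch.nn as nn

model = nn.Linear(32, 32)                       # stand-in for the flow model
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

batch = torch.randn(512, 32)                    # one full batch (--batch_size 512)
batch_steps = 2                                 # --batch_steps 2

optimizer.zero_grad()
for segment in batch.chunk(batch_steps):        # each segment holds 512 / 2 = 256 examples
    loss = model(segment).pow(2).mean() / batch_steps   # average the loss across segments
    loss.backward()                             # gradients accumulate over segments
optimizer.step()                                # one optimizer update per full batch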
To train a model in parallel across multiple GPUs, use distributed.py or slurm.py, and refer to the PyTorch distributed data parallel training tutorial.
We also implement the MaCow model with distributed training support. To train a new MaCow model, please use the MaCow config files for the different datasets.
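For reference, the sketch below illustrates the standard PyTorch DistributedDataParallel pattern described in that tutorial. It is only an illustration of the general setup; the model and data here are placeholders, not the code in distributed.py or slurm.py.

# Minimal sketch of the PyTorch DistributedDataParallel pattern (placeholder model and data).
import os
import torch
import torch.distributed as dist
import torch.nn as nn
from torch.nn.parallel import DistributedDataParallel as DDP

def main():
    dist.init_process_group(backend="nccl")      # RANK / WORLD_SIZE come from the launcher
    local_rank = int(os.environ["LOCAL_RANK"])
    torch.cuda.set_device(local_rank)

    model = nn.Linear(32, 32).cuda(local_rank)   # stand-in for the flow model
    model = DDP(model, device_ids=[local_rank])  # gradients are all-reduced across GPUs

    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    x = torch.randn(64, 32, device=f"cuda:{local_rank}")  # stand-in for a data batch
    loss = model(x).pow(2).mean()
    loss.backward()
    optimizer.step()

    dist.destroy_process_group()

if __name__ == "__main__":
    main()

A script like this is typically launched with torchrun --nproc_per_node=<num gpus> <script>.py (or via a Slurm job that sets the same environment variables).

If you use this code, please cite the following papers: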
@InProceedings{decoupling2021,
title = {Decoupling Global and Local Representations via Invertible Generative Flows},
author = {Ma, Xuezhe and Kong, Xiang and Zhang, Shanghang and Hovy, Eduard},
booktitle = {Proceedings of the 9th International Conference on Learning Representations (ICLR-2021)},
year = {2021},
month = {May},
}
@incollection{macow2019,
title = {MaCow: Masked Convolutional Generative Flow},
author = {Ma, Xuezhe and Kong, Xiang and Zhang, Shanghang and Hovy, Eduard},
booktitle = {Advances in Neural Information Processing Systems 32 (NeurIPS-2019)},
year = {2019},
publisher = {Curran Associates, Inc.}
}