Experiment Code for the Paper "CoT: Cooperative Training for Generative Modeling of Discrete Data"
We propose a new algorithmic paradigm for training tractable, explicit-density generative models such as RNN language models; a toy sketch of the cooperative loop is given below.
The paper, CoT: Cooperative Training for Generative Modeling of Discrete Data, is available on arXiv and has been accepted as a conference paper at ICML 2019.
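
For intuition, the cooperative loop can be sketched on a toy categorical distribution: a mediator is fit by maximum likelihood on a balanced mixture of real and generated samples, and the generator is then pulled toward the mediator by minimizing KL(G || M), which reduces JSD(P || G) once M approximates the mixture (P + G)/2. The snippet below is only an illustrative sketch, not the repository's implementation; PyTorch is used for brevity and all variable names are made up for the example.

import torch

torch.manual_seed(0)
K = 8                                                 # toy vocabulary size
target_probs = torch.softmax(torch.randn(K), dim=0)   # fixed "real" distribution P

gen_logits = torch.zeros(K, requires_grad=True)       # generator G_theta
med_logits = torch.zeros(K, requires_grad=True)       # mediator M_phi
opt_g = torch.optim.Adam([gen_logits], lr=1e-2)
opt_m = torch.optim.Adam([med_logits], lr=1e-2)

for step in range(2000):
    # Mediator step: MLE on a balanced mixture of real and generated samples.
    with torch.no_grad():
        real = torch.multinomial(target_probs, 64, replacement=True)
        fake = torch.multinomial(torch.softmax(gen_logits, dim=0), 64, replacement=True)
    mixture = torch.cat([real, fake])
    loss_m = -torch.log_softmax(med_logits, dim=0)[mixture].mean()
    opt_m.zero_grad()
    loss_m.backward()
    opt_m.step()

    # Generator step: minimize KL(G || M); with M close to (P + G)/2 this lowers JSD(P || G).
    log_g = torch.log_softmax(gen_logits, dim=0)
    log_m = torch.log_softmax(med_logits, dim=0).detach()
    loss_g = (log_g.exp() * (log_g - log_m)).sum()
    opt_g.zero_grad()
    loss_g.backward()
    opt_g.step()

print("oracle P:   ", target_probs.numpy().round(3))
print("generator G:", torch.softmax(gen_logits, dim=0).detach().numpy().round(3))
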
We provide example code for reproducing the synthetic Turing test experiment, with evaluation of NLL_test, NLL_oracle, balanced NLL, and JSD(P || G) based on the oracle model:
$ python3 cot.py
Start Cooperative Training...
batch: 0 nll_oracle 11.429975
batch: 0 nll_test 8.524782
cooptrain epoch# 0 jsd 8.365606
batch: 100 nll_oracle 10.475937
batch: 100 nll_test 7.9382834
cooptrain epoch# 1 jsd 7.330582
batch: 200 nll_oracle 10.38681
batch: 200 nll_test 7.868909
... ...
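
For reference, when both the oracle P and the generator G are explicit distributions, the reported quantities can be written down directly. The helper below is a hedged sketch for the categorical case (the repository evaluates the sequence-level analogues); the function name toy_metrics is made up, and the balanced NLL is assumed here to be the mean of NLL_oracle and NLL_test, with the paper giving the exact definition.

import numpy as np

def toy_metrics(p, g, eps=1e-12):
    p = np.asarray(p, dtype=float)
    g = np.asarray(g, dtype=float)
    nll_oracle = -(g * np.log(p + eps)).sum()     # E_{s~G}[-log P(s)]: oracle scores generator samples
    nll_test = -(p * np.log(g + eps)).sum()       # E_{s~P}[-log G(s)]: generator scores oracle test data
    balanced_nll = 0.5 * (nll_oracle + nll_test)  # assumption: mean of the two NLLs
    m = 0.5 * (p + g)                             # mixture distribution used by the JSD
    kl = lambda a, b: (a * (np.log(a + eps) - np.log(b + eps))).sum()
    jsd = 0.5 * kl(p, m) + 0.5 * kl(g, m)         # JSD(P || G)
    return nll_oracle, nll_test, balanced_nll, jsd

print(toy_metrics([0.7, 0.2, 0.1], [0.5, 0.3, 0.2]))
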