
MaskGAN: Better Text Generation via Filling in the ______ #15

Metadata

- Paper: MaskGAN: Better Text Generation via Filling in the ______
- Authors: William Fedus, Ian Goodfellow, Andrew M. Dai
- Venue: ICLR 2018
- Link: https://arxiv.org/abs/1801.07736

Summary

Neural text generation models are typically autoregressive, trained by maximizing likelihood (equivalently, minimizing perplexity) and evaluated with validation perplexity. The authors argue that such training and evaluation can lead to poor sample quality: during generation, the model is forced to condition on prefixes it never conditioned on at training time, producing unpredictable dynamics in the RNN's hidden state (i.e., exposure bias). They therefore propose to improve sample quality with an actor-critic conditional GAN that fills in missing text conditioned on the surrounding context.
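
To make the exposure-bias point concrete, here is a minimal PyTorch sketch (not the paper's code; `ToyLM`, the vocabulary size, and all hyperparameters are made-up assumptions). The training loop feeds gold tokens back in (teacher forcing), while the sampling loop feeds the model's own samples back in, so at test time the model conditions on prefixes it never saw during training.

```python
# Minimal sketch of the teacher-forcing vs. free-running mismatch.
# Hypothetical toy model; not MaskGAN itself.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ToyLM(nn.Module):
    def __init__(self, vocab_size=100, hidden=32):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden)
        self.rnn = nn.GRU(hidden, hidden, batch_first=True)
        self.out = nn.Linear(hidden, vocab_size)

    def step(self, token, h=None):
        # One decoding step: embed the current token, advance the hidden state.
        e = self.embed(token).unsqueeze(1)            # (B, 1, H)
        o, h = self.rnn(e, h)
        return self.out(o.squeeze(1)), h              # logits (B, V), new state

model = ToyLM()
gold = torch.randint(0, 100, (4, 10))                 # (batch, seq_len)

# Training (teacher forcing): always condition on the GOLD prefix.
h, loss, tok = None, 0.0, gold[:, 0]
for t in range(1, gold.size(1)):
    logits, h = model.step(tok, h)
    loss = loss + F.cross_entropy(logits, gold[:, t])
    tok = gold[:, t]                                  # gold token fed back in

# Sampling (free-running): condition on the model's OWN samples -- prefixes
# that may never have occurred at training time. This train/test mismatch
# is what MaskGAN's GAN objective targets.
h, tok = None, gold[:, 0]
samples = [tok]
for t in range(1, gold.size(1)):
    logits, h = model.step(tok, h)
    tok = torch.multinomial(F.softmax(logits, dim=-1), 1).squeeze(1)
    samples.append(tok)
```

Maximum-likelihood training only ever optimizes the first loop; MaskGAN instead trains the generator with rewards from a discriminator evaluated on its own free-running samples.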

Related Work