LantaoYu / SeqGAN

Implementation of Sequence Generative Adversarial Nets with Policy Gradient

About pre-training generator #34

Open playma opened 7 years ago

playma commented 7 years ago

Excuse me, I have some questions. If I want to train SeqGAN on real data like natural text (poems, articles), can I still pre-train the generator? I can't know the true distribution of the text, and I don't have the target_lstm params.

EternalFeather commented 6 years ago

@playma I think target_lstm is just an oracle model used to generate a "toy corpus" for this experiment. As the paper says, it generates token sequences that serve as the real data. So if you want to pre-train on real data like natural text, you don't need the target_lstm model; you can pre-train the generator directly on your corpus with maximum likelihood.
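In other words, the only change is where the pre-training sequences come from: instead of sampling them from target_lstm, you build them from your own text. A minimal sketch (hypothetical helper, not part of this repo) of turning a real corpus into the fixed-length token id sequences the generator's MLE pre-training expects:

```python
def build_pretrain_data(corpus, seq_len=20, pad_token="<pad>"):
    """Map raw sentences to fixed-length token id sequences.

    These sequences replace the target_lstm samples as the "real data"
    for maximum-likelihood pre-training of the generator.
    """
    vocab = {pad_token: 0}
    sequences = []
    for sentence in corpus:
        ids = []
        for tok in sentence.split():
            if tok not in vocab:
                vocab[tok] = len(vocab)  # assign the next free id
            ids.append(vocab[tok])
        # truncate to seq_len, then pad short sequences with the pad id
        ids = ids[:seq_len] + [vocab[pad_token]] * max(0, seq_len - len(ids))
        sequences.append(ids)
    return sequences, vocab
```

The resulting id sequences can then be fed to the generator's pre-training step in place of the oracle-generated batches; the discriminator likewise trains against these real sequences instead of target_lstm output.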