suragnair / seqGAN

A simplified PyTorch implementation of "SeqGAN: Sequence Generative Adversarial Nets with Policy Gradient." (Yu, Lantao, et al.)

Can a Seq2Seq model be used as the Generator for conversation? #6

Closed edlin0249 closed 6 years ago

edlin0249 commented 6 years ago

Hi, I want to modify your code to apply it to a conversation task by using a Seq2Seq model as the Generator. Does that make sense? I am confused because a Seq2Seq model produces a response conditioned on an input sentence (with a target sentence for training), whereas the GAN here generates a plausible sentence from any sample, unconditioned. I discussed this with my professor, and she described the same issue, so I am asking you whether it is possible. Thanks. Best Regards, Ying-Ting Lin

suragnair commented 6 years ago

For conversation, you can provide data samples without an oracle generator. The oracle generator is only used in the experiments to show the usefulness of the GAN training stage. For conversation data, you don't have an oracle model (because that's what you are trying to find).

So you will have to rely only on the statistics provided by the generator you are training.
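As a rough illustration of the idea discussed above, here is a minimal sketch (not from this repository; all names, sizes, and the GRU architecture are assumptions) of a conditional generator: a Seq2Seq encoder-decoder whose decoder samples a response token by token, trained with a REINFORCE-style policy-gradient loss where the per-token rewards would come from a discriminator scoring real vs. generated responses.

```python
import torch
import torch.nn as nn

# Hypothetical conditional SeqGAN generator (illustrative only):
# the encoder's final hidden state conditions the decoder on the input sentence.
class Seq2SeqGenerator(nn.Module):
    def __init__(self, vocab_size, emb_dim=32, hid_dim=64):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim)
        self.encoder = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.decoder = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim, vocab_size)

    def sample(self, src, max_len, bos_id=0):
        # Encode the input sentence; feed its hidden state to the decoder.
        _, h = self.encoder(self.emb(src))
        tok = torch.full((src.size(0), 1), bos_id, dtype=torch.long)
        tokens, log_probs = [], []
        for _ in range(max_len):
            out, h = self.decoder(self.emb(tok), h)
            dist = torch.distributions.Categorical(logits=self.out(out[:, -1]))
            tok = dist.sample().unsqueeze(1)       # sampled next token, (B, 1)
            tokens.append(tok)
            log_probs.append(dist.log_prob(tok.squeeze(1)))
        return torch.cat(tokens, 1), torch.stack(log_probs, 1)

def pg_loss(log_probs, rewards):
    # REINFORCE: maximise reward-weighted log-likelihood of sampled tokens.
    return -(log_probs * rewards).sum(1).mean()

gen = Seq2SeqGenerator(vocab_size=10)
src = torch.randint(0, 10, (4, 5))   # batch of 4 input sentences, length 5
resp, lp = gen.sample(src, max_len=6)
rewards = torch.rand(4, 6)           # stand-in for discriminator scores
loss = pg_loss(lp, rewards)
loss.backward()
```

In this setup the discriminator would see (input sentence, response) pairs, so the reward reflects how plausible the response is for that input, which is exactly what replaces the oracle's statistics in the conversation setting.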