rpryzant / delete_retrieve_generate

PyTorch implementation of the Delete, Retrieve Generate style transfer algorithm
MIT License

What's the point of seq2seq model in your code? #9

Closed wusj18 closed 5 years ago

wusj18 commented 5 years ago

I wonder why you added a seq2seq model to your code. Which method does it correspond to in the paper? Thank you very much.

rpryzant commented 5 years ago

Seq2seq doesn't correspond to any method in the paper. It is the "LSTM encoder/decoder with attention" model that all of these style transfer algorithms build on. I think the proper citation for it would be this: https://arxiv.org/abs/1508.04025

Seq2seq is included in this package because (1) all of the other models build on it, so it made sense to expose it to the user as well, and (2) I originally wrote this for my own research purposes and wanted the standard seq2seq as a baseline.
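
For reference, here's a rough sketch of what that baseline looks like. This is not the code in this repo, just a minimal PyTorch illustration of an LSTM encoder/decoder with Luong-style "general" attention; the class name and dimensions are made up for the example:

```python
import torch
import torch.nn as nn

class Seq2SeqAttention(nn.Module):
    """Minimal LSTM encoder/decoder with Luong-style 'general' attention (illustrative sketch)."""
    def __init__(self, vocab_size, emb_dim=128, hidden_dim=256):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, emb_dim)
        self.encoder = nn.LSTM(emb_dim, hidden_dim, batch_first=True)
        self.decoder = nn.LSTM(emb_dim, hidden_dim, batch_first=True)
        # score(h_t, h_s) = h_t^T W h_s  (the "general" scoring function from Luong et al.)
        self.attn = nn.Linear(hidden_dim, hidden_dim, bias=False)
        self.out = nn.Linear(2 * hidden_dim, vocab_size)

    def forward(self, src, tgt):
        # Encode the source sequence; pass the final state to the decoder.
        enc_out, enc_state = self.encoder(self.embedding(src))       # (B, S, H)
        dec_out, _ = self.decoder(self.embedding(tgt), enc_state)    # (B, T, H)

        # Attention scores of every decoder step over every encoder state.
        scores = torch.bmm(self.attn(dec_out), enc_out.transpose(1, 2))  # (B, T, S)
        weights = torch.softmax(scores, dim=-1)
        context = torch.bmm(weights, enc_out)                            # (B, T, H)

        # Combine context vector and decoder state, project to the vocabulary.
        return self.out(torch.cat([dec_out, context], dim=-1))           # (B, T, V)

# Example usage with toy token ids (teacher forcing on the target side):
model = Seq2SeqAttention(vocab_size=10000)
logits = model(torch.randint(0, 10000, (4, 12)), torch.randint(0, 10000, (4, 9)))
print(logits.shape)  # torch.Size([4, 9, 10000])
```

The Delete/Retrieve/Generate variants keep this encoder/decoder backbone and change what gets fed into it (content words only, retrieved attribute markers, etc.).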

wusj18 commented 5 years ago

Thank you very much for your detailed explanation! :-)