graykode / nlp-tutorial

Natural Language Processing Tutorial for Deep Learning Researchers
https://www.reddit.com/r/MachineLearning/comments/amfinl/project_nlptutoral_repository_who_is_studying/
MIT License

how to use seq2seq(attention) for multiple batch #46

Open jbjeong91 opened 4 years ago

jbjeong91 commented 4 years ago

How can I use the seq2seq (attention) model with a batch size larger than one?

wmathor commented 4 years ago

Maybe you can look at this code: https://github.com/wmathor/nlp-tutorial/blob/master/4-2.Seq2Seq(Attention)/Seq2Seq(Attention)-Torch.ipynb
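For reference, the core change needed for batching is that the attention step must operate on a whole batch of encoder outputs at once instead of looping over a single sample. Below is a minimal NumPy sketch (not the notebook's exact code) of batched dot-product attention over shapes `[B, T, H]`; in the repo's PyTorch code the same contraction would typically be done with `torch.bmm`. The function name and shapes here are illustrative assumptions, not identifiers from the tutorial.

```python
import numpy as np

def batched_dot_attention(dec_hidden, enc_outputs):
    """Dot-product attention for a whole batch.

    dec_hidden:  [B, H]    current decoder hidden state per sample
    enc_outputs: [B, T, H] encoder outputs per sample
    Returns (context [B, H], attention weights [B, T]).
    """
    # Score each encoder step against the decoder state: scores[b, t] = dec_hidden[b] . enc_outputs[b, t]
    scores = np.einsum('bh,bth->bt', dec_hidden, enc_outputs)   # [B, T]
    # Softmax over the time axis (numerically stabilized)
    scores = scores - scores.max(axis=1, keepdims=True)
    attn = np.exp(scores)
    attn /= attn.sum(axis=1, keepdims=True)                     # [B, T]
    # Weighted sum of encoder outputs gives the context vector per sample
    context = np.einsum('bt,bth->bh', attn, enc_outputs)        # [B, H]
    return context, attn

B, T, H = 4, 5, 8  # batch size, source length, hidden size
rng = np.random.default_rng(0)
context, attn = batched_dot_attention(rng.standard_normal((B, H)),
                                      rng.standard_normal((B, T, H)))
print(context.shape, attn.shape)  # (4, 8) (4, 5)
```

The key point is that no per-sample Python loop is needed: the batch dimension rides along through the einsum/softmax, which is exactly what `torch.bmm(dec_hidden.unsqueeze(1), enc_outputs.transpose(1, 2))` accomplishes in the PyTorch version.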