tensorflow / nmt

TensorFlow Neural Machine Translation Tutorial
Apache License 2.0

During GNMT 8-layer LSTM training, can we train on full paragraphs, or must we break them into sentences? #236

Closed ndvbd closed 6 years ago

ndvbd commented 6 years ago

During GNMT 8-layer LSTM training, can we train on full paragraphs (from source language to target language), or must we break them into sentences?

Can the model be trained to cope well with full paragraphs?

lmthang commented 6 years ago

In our experiments, we can train on longer sequences, e.g., lengths of 200-400 (though depending on your model size, you might run out of memory). Training on long sequences also takes more time. Running experiments is the best way to answer your question :)
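For picking a maximum length before trying paragraph-level training, a minimal sketch of a corpus length audit may help. The file names and the whitespace tokenization are illustrative assumptions, not part of this repo's code.

```python
# Rough length audit of a parallel corpus, to help choose a maximum
# sequence length before training on paragraph-level data.
# File paths and whitespace tokenization are illustrative assumptions.
from collections import Counter

def length_histogram(path, bucket=50):
    """Count line lengths (in whitespace tokens), grouped into buckets."""
    counts = Counter()
    with open(path, encoding="utf-8") as f:
        for line in f:
            n = len(line.split())
            counts[(n // bucket) * bucket] += 1
    return counts

for side in ("train.src", "train.tgt"):  # hypothetical file names
    hist = length_histogram(side)
    print(side)
    for lo in sorted(hist):
        print(f"  {lo:4d}-{lo + 49:4d} tokens: {hist[lo]} examples")
```

If most paragraphs fall well above the length you can fit in memory, splitting them into shorter units (as discussed below) is the usual workaround.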

ndvbd commented 6 years ago

Thanks. I found it beneficial to train on chunks of around 3 sentences, in order to provide the model with some beyond-sentence context.
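A minimal sketch of one way to build such multi-sentence training units, assuming the corpus is already sentence-aligned with one sentence pair per line. The function name, file names, and chunk size of 3 are illustrative assumptions, not part of nmt.

```python
# Merge consecutive aligned sentence pairs into ~3-sentence training units,
# keeping source and target sides in sync. A sketch under the assumption of
# a sentence-aligned corpus (one sentence pair per line).
def chunk_sentences(src_path, tgt_path, out_src, out_tgt, chunk_size=3):
    with open(src_path, encoding="utf-8") as fs, \
         open(tgt_path, encoding="utf-8") as ft, \
         open(out_src, "w", encoding="utf-8") as os_, \
         open(out_tgt, "w", encoding="utf-8") as ot:
        src_buf, tgt_buf = [], []
        for s, t in zip(fs, ft):
            src_buf.append(s.strip())
            tgt_buf.append(t.strip())
            if len(src_buf) == chunk_size:
                os_.write(" ".join(src_buf) + "\n")
                ot.write(" ".join(tgt_buf) + "\n")
                src_buf, tgt_buf = [], []
        if src_buf:  # flush a trailing partial chunk
            os_.write(" ".join(src_buf) + "\n")
            ot.write(" ".join(tgt_buf) + "\n")

chunk_sentences("train.src", "train.tgt", "train3.src", "train3.tgt")
```

Note that this naive chunking can merge sentences across document boundaries; if document boundaries are available, one would reset the buffer at each boundary.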

lmthang commented 6 years ago

Thanks for the info! Closing this for now.