tensorflow / models

Models and examples built with TensorFlow

[failure] textsum: fails to generate any meaningful headlines as shown in the Examples #1515

Closed: ghost closed this issue 7 years ago

ghost commented 7 years ago

I have run seq2seq_attention.py and the average loss dropped to about 0.5. I then ran it in decode mode, but the output is full of UNK symbols instead of the meaningful headlines shown in the examples. BTW, I trained the model using the toy example you provide. I wonder why I can't reproduce your results even though I ran the same model on the same data set. Did you produce those examples by training on the WHOLE data set, or only on the part you provide? Thanks in advance!

michaelisard commented 7 years ago

This question is better asked on StackOverflow since it is not a bug or feature request. There is also a larger community that reads questions there. Thanks!

anthnyprschka commented 7 years ago

Training with the toy data will not give you useful results. One reason is that the toy vocabulary contains only ~10k words, so many out-of-vocabulary words in the toy dataset get substituted with <UNK> tokens.
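For illustration, here is a minimal sketch of how a small fixed vocabulary ends up mapping out-of-vocabulary words to a single UNK id. This is not the textsum implementation itself; the `<UNK>` token name, the `build_vocab`/`encode` helpers, and the 10k cutoff are illustrative assumptions.

```python
# Minimal sketch (not the actual textsum code): with a small fixed
# vocabulary, every word outside it collapses to the same <UNK> id,
# so decoded headlines built from those ids print as strings of UNKs.
from collections import Counter

UNK = "<UNK>"

def build_vocab(words, max_size=10000):
    """Keep only the max_size most frequent words; everything else is OOV."""
    counts = Counter(words)
    vocab = [UNK] + [w for w, _ in counts.most_common(max_size - 1)]
    return {w: i for i, w in enumerate(vocab)}

def encode(sentence, word_to_id):
    """Map each word to its id, falling back to the <UNK> id when missing."""
    unk_id = word_to_id[UNK]
    return [word_to_id.get(w, unk_id) for w in sentence.split()]

# Tiny example: a toy 'corpus' and a sentence with unseen words.
corpus = "the cat sat on the mat".split()
word_to_id = build_vocab(corpus, max_size=5)
print(encode("the dog sat on the sofa", word_to_id))
# 'dog' and 'sofa' both map to the <UNK> id (0), which is what the
# decoder keeps emitting when trained with too small a vocabulary.
```

With so much of the input and target text reduced to the same token, the decoder has little to learn from, which is consistent with the all-UNK output described above.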