Open yynil opened 7 years ago
I think I found the root cause. When I reverted to the original softmax implementation, the toy dataset trains fine and reaches a BLEU score of 99.8. Now I need to figure out how to fix the memory-leak issue in TensorFlow.
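For context, the "original" softmax referred to here is the standard full softmax over the vocabulary. A minimal, numerically stable sketch in plain Python (a generic illustration, not the actual seq2seq or TensorFlow code) looks like this:

```python
import math

def softmax(logits):
    # Subtract the max logit before exponentiating so large values
    # don't overflow; this doesn't change the result mathematically.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    # Normalize so the outputs form a probability distribution.
    return [e / total for e in exps]

probs = softmax([1.0, 2.0, 3.0])
```

If a replacement softmax (e.g. a sampled or approximate variant) is miscalibrated, the decoder can collapse onto a single high-frequency token, which would explain the constant "1 1 1 1" output.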
Following the tutorial, I set up an experiment with the toy dataset. Because of https://github.com/google/seq2seq/issues/224, I changed the softmax implementation. But even on the toy dataset I can't get any reasonable results: the BLEU score is always zero and the output is "1 1 1 1 ", which is completely meaningless.
I'd like to know whether this is because my version of TensorFlow is too new to reproduce the results. I'm using the latest TF from the official site. Could you please tell me which version of TF you are using? Thanks a lot!