llSourcell / tensorflow_chatbot

Tensorflow chatbot demo by @Sirajology on Youtube

Useless output at 42500 steps #48

Open · yochrisbolton opened 7 years ago

yochrisbolton commented 7 years ago

I'm currently using Python 3 with the fix from https://github.com/llSourcell/tensorflow_chatbot/issues/46

```
Reading model parameters from working_dir/seq2seq.ckpt-42500

hello
how are you?
I ' m not .
you're not?
No .
what colo is the sky?
_UNK .
who are you?
I ' m not .
Whats your name?
_UNK .
```

This is my last eval:

```
global step 42500 learning rate 0.0383 step-time 0.46 perplexity 18.02
  eval: bucket 0 perplexity 1329.99
  eval: bucket 1 perplexity 1946.94
  eval: bucket 2 perplexity 1677.25
  eval: bucket 3 perplexity 1582.53
```
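For context on these numbers: in the TensorFlow seq2seq training loop this repo is based on, reported perplexity is the exponential of the per-token cross-entropy loss, so a training perplexity of ~18 next to eval perplexities above 1300 means the model fits its training buckets far better than held-out data. A minimal illustration (not the repo's code):

```python
import math

# Perplexity, as printed by the training loop, is exp(cross-entropy loss)
# averaged per token. perplexity == 1 therefore means zero loss, i.e. the
# model has effectively memorized its training data.
def perplexity(cross_entropy_loss):
    return math.exp(cross_entropy_loss)

print(perplexity(2.89))   # ~18.0, close to the training figure above
print(math.log(1677.25))  # ~7.42 nats of loss on eval bucket 2
```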

With these settings:

```ini
# number of LSTM layers : 1/2/3
num_layers = 1
# typical options : 128, 256, 512, 1024
layer_size = 128
learning_rate = 0.09
```
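(The other knobs in seq2seq.ini matter too, especially the vocabulary sizes, which bound how many distinct words the bot can ever emit; everything rarer becomes _UNK. The values below are illustrative guesses in the style of the old TensorFlow translate tutorial this repo derives from, not confirmed repo defaults; check your own file.)

```ini
# illustrative values only -- not confirmed repo defaults
enc_vocab_size = 20000
dec_vocab_size = 20000
num_layers = 3
layer_size = 256
batch_size = 64
learning_rate = 0.5
learning_rate_decay_factor = 0.99
max_gradient_norm = 5.0
steps_per_checkpoint = 300
```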

This output seems useless. I see that there is a Python 3 fix in https://github.com/llSourcell/tensorflow_chatbot/issues/22

Is it worth attempting the Python 3 fix along with the TF 1.2 fix, or is my best bet to run TensorFlow 0.12 in my Python 2.x environment?

charlestruluck commented 7 years ago

@finchMFG share your whole seq2seq.ini?

brave3d commented 7 years ago

Same here: `I ' m not .`, `_UNK .`, `_UNK .`, `_UNK .` Maybe we have to wait until perplexity == 1. I was worried about my laptop, so I stopped the training 😅

Rabisha commented 6 years ago

I trained until perplexity == 1, but I still got useless output. What is the problem here?

lohith-emplay commented 6 years ago

Did anyone find a fix for this _UNK problem?
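No definitive fix appears in this thread, but for anyone landing here: `_UNK` is the id that data preparation assigns to every token outside the fixed-size vocabulary (this repo follows the convention of the old TensorFlow translate tutorial's data_utils). A model trained on a small corpus with a small vocabulary will legitimately learn to emit `_UNK` for rare words, so raising the vocab size or using more data usually helps more than training longer. A minimal sketch of the mechanism, with illustrative names:

```python
# Sketch of why _UNK appears: tokens outside the capped vocabulary are
# mapped to the _UNK id during data prep, so the model can learn to
# emit _UNK itself. Function names here are illustrative, not the repo's.
from collections import Counter

_PAD, _GO, _EOS, _UNK = "_PAD", "_GO", "_EOS", "_UNK"

def build_vocab(sentences, max_vocab_size=20000):
    counts = Counter(tok for s in sentences for tok in s.split())
    words = [_PAD, _GO, _EOS, _UNK] + [w for w, _ in counts.most_common()]
    words = words[:max_vocab_size]  # everything rarer falls out of the vocab
    return {w: i for i, w in enumerate(words)}

def encode(sentence, vocab):
    unk_id = vocab[_UNK]
    return [vocab.get(tok, unk_id) for tok in sentence.split()]

vocab = build_vocab(["hello how are you", "what color is the sky"],
                    max_vocab_size=8)
print(encode("what color is the moon", vocab))  # all out-of-vocab -> [3, 3, 3, 3, 3]
```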