oswaldoludwig / Seq2seq-Chatbot-for-Keras

This repository contains a new generative chatbot model based on seq2seq modeling.
Apache License 2.0

Weights loading fails #8

Closed: EgorGumin closed this issue 6 years ago

EgorGumin commented 6 years ago

Hi! I'm trying to launch your example, but I'm hitting an error when loading the weights:

Traceback (most recent call last):
  File "conversation.py", line 160, in <module>
    model.load_weights(weights_file)
  File "/usr/local/lib/python2.7/dist-packages/keras/engine/topology.py", line 2367, 
in load_weights  ' elements.')
Exception: Layer #3 (named "Encode answer up to the current token" in the current model) 
was found to correspond to layer lstm_1 in the save file. 
However the new layer Encode answer up to the current token expects 12 weights, 
but the saved weights have 3 elements.

oswaldoludwig commented 6 years ago

Are you using the Theano backend? "This project was done in Linux, Python 2.x, Theano 0.9.0, and Keras 1.0.4. The use of another configuration may require some minor adaptations."
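
For reference, a quick way to check (and, if needed, force) the backend before importing anything else is sketched below; the KERAS_BACKEND override is an assumption about how the environment is configured, not part of the repository:

```python
# Sketch: verify which backend Keras picks up before running conversation.py.
import os
os.environ.setdefault('KERAS_BACKEND', 'theano')  # only takes effect if no backend is configured yet
from keras import backend as K
print(K.backend())  # expected to print 'theano' for this project
```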

EgorGumin commented 6 years ago

Yes, I'm using Python 2.7.12, Theano 0.9.0 and Keras 1.0.4 on Ubuntu 16.04

oswaldoludwig commented 6 years ago

If you keep these parameters, the code should run smoothly: word_embedding_size = 100, sentence_embedding_size = 300, dictionary_size = 7000, maxlen_input = 50.
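
For reference, these would sit as plain module-level assignments near the top of conversation.py; the comments below are descriptive assumptions, not taken from the repository:

```python
# Hyperparameters expected by the pretrained weights (sketch).
word_embedding_size = 100      # size of each word embedding vector
sentence_embedding_size = 300  # size of the LSTM sentence encoding
dictionary_size = 7000         # vocabulary size
maxlen_input = 50              # maximum input length, in tokens
```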

EgorGumin commented 6 years ago

I did not modify your conversation.py example, so I have exactly the same parameters, but the code fails with the error mentioned above. I think the error might be caused by an incorrect weights file, but I'm not sure.

oswaldoludwig commented 6 years ago

Can you check the new model conversation_discriminator.py?

EgorGumin commented 6 years ago

The same result:

Using Theano backend.
Starting the model...

CHAT:

computer: hi ! please type your name.
user: Egor
computer: hi , Egor ! My name is john.
user: Hi

Traceback (most recent call last):
  File "conversation_discriminator.py", line 279, in <module>
    model.load_weights(weights_file)
  File "/usr/local/lib/python2.7/dist-packages/keras/engine/topology.py", line 2367, 
in load_weights ' elements.')
Exception: Layer #3 (named "Encode answer up to the current token" in the current model) 
was found to correspond to layer lstm_1 in the save file. However the new layer 
Encode answer up to the current token expects 12 weights, but the saved weights have 3 elements.

oswaldoludwig commented 6 years ago

Now I have no clue. If you find a workaround for your issue, let us know; it could help other users cope with similar issues. Otherwise, you can delete the weight file and train the model from scratch (see the readme file). Using a GPU, you will have a chatbot able to chat with you after less than one hour of training.

ascherbakhov commented 6 years ago

I have this issue too. Ubuntu 16.04, Python 2.7, Keras 1.0.4, Theano 0.9.0. Can you please rerun conversation.py with your weights file on your machine? I also get an import error on "from keras.utils import plot_model". As far as I know, plot_model was moved to another module, and in Keras 1.0.4 the plot function lived in keras.utils.visualize_util. I'm afraid we have different Keras versions.

P.S. I installed Keras with "pip install keras==1.0.4".
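
If the only incompatibility were that moved utility, a small shim like the one below would keep the rest of the script untouched (a sketch; it assumes nothing else changed between the two import paths):

```python
# Sketch: import the model-plotting helper under either Keras version.
try:
    from keras.utils import plot_model                          # Keras 2.x
except ImportError:
    from keras.utils.visualize_util import plot as plot_model   # Keras 1.x
```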

oswaldoludwig commented 6 years ago

I have here Ubuntu 14.04.3 LTS, Python 2.7.6, Theano 0.9.0, and Keras 2.0.4.

oswaldoludwig commented 6 years ago

I updated the files conversation.py and conversation_discriminator.py. Both scripts now run without problems or warnings on my machine with Keras 2.0.4. Please let me know if this has solved your problem (remember to install Keras 2.0.4).
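
Before rerunning the updated scripts, it is worth confirming which Keras version the interpreter actually loads (a trivial check, not specific to this repository):

```python
# Print the Keras version the scripts will use.
import keras
print(keras.__version__)  # should read 2.0.4 for the updated scripts
```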