gau820827 / AI-writer_Data2Doc

PyTorch implementation of an NBA game summary generator.
Apache License 2.0

Error in using small_evaluate #52

Closed: cmuspencerlo closed this issue 6 years ago

cmuspencerlo commented 6 years ago

Hi,

When I run python3 small_evaluate.py, I get the following error message:

KeyError: 'unexpected key "LocalEncoder.embedding.embedding1.weight" in state_dict'

Any idea on how to solve this problem?

Thanks.

weikaipan commented 6 years ago

Hi, thanks for letting us know. Could you tell us which type of local encoder you used? The error may be caused by an inconsistent number of layers.

For example, if you use "EncoderLIN" as the encoder, then the parameter LAYER_DEPTH in settings.py should always be 1. More generally, the depth used at evaluation time must match the saved model: if your saved model was trained with a layer depth of 3, then LAYER_DEPTH in settings.py must also be 3.
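For what it's worth, here is a minimal PyTorch sketch (not the repo's code; the modules and sizes are illustrative) of how that kind of depth mismatch surfaces as an "unexpected key" error when loading a state_dict:

```python
# Not the project's code: a minimal sketch of a layer-depth mismatch.
import torch.nn as nn

# Encoder saved with a depth of 3: its state_dict has keys "0.*", "1.*", "2.*".
saved_encoder = nn.Sequential(*[nn.Linear(8, 8) for _ in range(3)])
state = saved_encoder.state_dict()

# Encoder rebuilt for evaluation with a depth of 1: it only expects keys "0.*".
eval_encoder = nn.Sequential(nn.Linear(8, 8))

try:
    eval_encoder.load_state_dict(state)
except (KeyError, RuntimeError) as e:
    # Older PyTorch raises KeyError ('unexpected key "..." in state_dict'),
    # newer versions raise RuntimeError listing the unexpected keys.
    print(e)
```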

Thanks.

cmuspencerlo commented 6 years ago

Hi,

Thanks for the quick reply :)

Here is my scenario: I run python3 train.py without any modification, which saves the trained weights to the local directory. Then I run python3 small_evaluate.py directly to get some quick results. Throughout training and evaluation I only use the default parameters in settings.py.

In this case, I am wondering what could cause the inconsistency.

weikaipan commented 6 years ago

Hi,

Thanks for providing the scenario :) I realized that the error and the inconsistency are on our end. Sorry about that. The version in our repo is not the latest one, and our team plans to update it in the coming days. In the meantime, here is the reason for the error.

The cause of the inconsistency is that the default ENCODER_STYLE in settings.py is HierarchicalRNN, which is not one of the types handled on lines 22 to 30 of small_evaluate.py. As a result, the evaluation script falls back to expecting a linear encoder. However, the encoder of a model saved by train.py is an RNN encoder by default, so loading its state_dict raises the error.
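If it helps while the fix is pending, one way to double-check which encoder a checkpoint was saved with is to load it and list its state_dict keys, then make ENCODER_STYLE in settings.py match before running small_evaluate.py. This is only a sketch: it assumes the checkpoint stores a plain state_dict saved with torch.save, and the path below is a placeholder.

```python
# Sketch only: the checkpoint path is a placeholder, and this assumes the
# encoder's state_dict was saved directly with torch.save.
import torch

checkpoint_path = "path/to/saved_encoder.pt"  # placeholder, adjust to your setup
state = torch.load(checkpoint_path, map_location="cpu")

for key in sorted(state.keys()):
    print(key)

# Keys like "LocalEncoder.embedding.embedding1.weight" point to the hierarchical
# RNN encoder rather than a linear one, so ENCODER_STYLE at evaluation time
# should select the same encoder type that train.py used.
```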

Thank you for letting us know again, and we'll update our master branch soon.

cmuspencerlo commented 6 years ago

That makes sense.

Thanks for the effort!