deeppavlov / DeepPavlov

An open source library for deep learning end-to-end dialog systems and chatbots.
https://deeppavlov.ai
Apache License 2.0

Hard-coded context length in DAM ranking model? #978

Closed: kauttoj closed this issue 4 years ago

kauttoj commented 5 years ago

I'm training a DAM neural ranking model following the example for the Ubuntu V2 dataset. I want to repeat the same analysis with my own data, using varying context lengths. However, I think there is a problem on line 99 of "deep_attention_matching_network.py". The call

super(DAMNetwork, self).__init__(*args, **kwargs)

does not allow custom context lengths; it always reverts to the default of 10 (set on line 46 of "tf_base_matching_model.py"). The correct behavior (at least, no errors) is achieved by replacing the above call with:

super(DAMNetwork, self).__init__(num_context_turns=num_context_turns, *args, **kwargs)
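
To illustrate the mechanism outside of DeepPavlov, here is a minimal self-contained sketch (the class names and simplified signatures are hypothetical, not the library's actual code). The keyword argument is bound by the subclass's own parameter list, so it never reaches the base class through **kwargs, and the base default silently wins:

class BaseMatchingModel:
    # Stands in for tf_base_matching_model.py: the default of 10 applies
    # whenever the subclass does not forward the value.
    def __init__(self, num_context_turns=10, **kwargs):
        self.num_context_turns = num_context_turns

class BuggyDAM(BaseMatchingModel):
    def __init__(self, num_context_turns=10, *args, **kwargs):
        # num_context_turns is captured by the local parameter, so it is
        # no longer inside kwargs; the base class never sees it.
        super(BuggyDAM, self).__init__(*args, **kwargs)

class FixedDAM(BaseMatchingModel):
    def __init__(self, num_context_turns=10, *args, **kwargs):
        # Forward the captured value explicitly, as in the fix above.
        super(FixedDAM, self).__init__(num_context_turns=num_context_turns, *args, **kwargs)

print(BuggyDAM(num_context_turns=3).num_context_turns)  # prints 10: the default wins
print(FixedDAM(num_context_turns=3).num_context_turns)  # prints 3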

Could you check whether this is a bug in the code, or whether something is missing from the "ranking_ubuntu_v2_mt_word2vec_dam.json" config file?
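
For reference, this is roughly how I am setting the context length: reading the stock config and overriding num_context_turns before training. This is a sketch assuming the usual DeepPavlov config layout (a "chainer"/"pipe" list of component dicts); the exact component keys may differ between versions:

from deeppavlov import configs, train_model
from deeppavlov.core.common.file import read_json

config = read_json(configs.ranking.ranking_ubuntu_v2_mt_word2vec_dam)

# Override the context length in every pipe component that declares it,
# e.g. the preprocessor and the DAM network itself.
for component in config['chainer']['pipe']:
    if 'num_context_turns' in component:
        component['num_context_turns'] = 5  # desired number of context turns

train_model(config)

With the call on line 99 unfixed, an override like this never reaches the network, which still builds with 10 turns.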

puleon commented 4 years ago

Yes, this is a bug. Thank you. We will fix it in the next release.