reppy4620 / Dialog

A PyTorch implementation of a Japanese chatbot using BERT and a Transformer decoder

Nice work! #10

Closed: ycat3 closed this issue 4 years ago

ycat3 commented 4 years ago

I added around 2,000 more lines of Japanese data for further training. I also added voice reply functionality. https://jweb.asia/26-it/ai/51-bert-chatbot.html

reppy4620 commented 4 years ago

Thank you so much for your great work, and sorry for the poor quality of the generated conversations.

I think the training data is biased because it was collected with queries such as "おはよう" ("good morning"), "疲れ" ("tired"), "それな" ("exactly"), etc. This is one reason the model has low diversity, i.e. it cannot generate varied utterances or long sequences. Of course, the model architecture and maximizing likelihood with cross-entropy loss also contribute.
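(Not part of the original exchange, just an illustration of the diversity point: one common mitigation at inference time is to replace greedy argmax decoding of the likelihood-trained distribution with temperature plus top-k sampling. The snippet below is a minimal sketch; `decoder`, `memory`, and `generated_ids` are hypothetical names, not this repository's actual API.)

```python
import torch
import torch.nn.functional as F

def sample_next_token(logits, temperature=0.8, top_k=20):
    """Sample the next token id from 1-D decoder logits instead of taking argmax.

    Greedy decoding of a cross-entropy-trained model tends to produce short,
    generic replies; temperature + top-k sampling trades a little likelihood
    for more varied utterances.
    """
    logits = logits / temperature                 # flatten/sharpen the distribution
    top_values, top_indices = torch.topk(logits, top_k)
    probs = F.softmax(top_values, dim=-1)         # renormalize over the top-k tokens
    choice = torch.multinomial(probs, num_samples=1)
    return top_indices[choice].item()

# Hypothetical usage inside a step-by-step decoding loop:
# logits = decoder(memory, generated_ids)[-1]   # scores for the next token
# next_id = sample_next_token(logits)
```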

By the way, may I add a link to your work in the README?

Thank you.

ycat3 commented 4 years ago

Thank you for the quick reply. Of course, you can link to my work, and feel free to contact me since I live in Yokohama.

reppy4620 commented 4 years ago

I'm glad to hear that. I'll update the README when I have free time. Thanks again.