macournoyer / neuralconvo

Neural conversational model in Torch

Limit the vocabulary size using word frequencies + remove LR decay #43

Open vikram-gupta opened 8 years ago

vikram-gupta commented 8 years ago

Hi @macournoyer ,

This is a small PR heavily inspired by the code changes done by @chenb67 in her fork and the discussions with you.

1) Commented out the LR decay (Adam already takes care of adapting the learning rate)
2) Limit the vocabulary size based on word frequencies (a rough sketch of the idea is below)
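For reference, here is a rough sketch of what the frequency-based capping looks like. This is not the code in this PR; the function, variable, and token names are illustrative only:

```lua
-- Sketch of frequency-based vocabulary capping (illustrative names only).
-- Words outside the top maxVocabSize all map to a shared <unknown> id.
local function buildVocab(tokenizedExamples, maxVocabSize)
  -- Count how often each word appears across the corpus.
  local counts = {}
  for _, tokens in ipairs(tokenizedExamples) do
    for _, word in ipairs(tokens) do
      counts[word] = (counts[word] or 0) + 1
    end
  end

  -- Sort words by descending frequency.
  local words = {}
  for word in pairs(counts) do
    table.insert(words, word)
  end
  table.sort(words, function(a, b) return counts[a] > counts[b] end)

  -- Keep only the maxVocabSize most frequent words.
  local word2id = { ["<unknown>"] = 1 }
  local nextId = 2
  for i = 1, math.min(maxVocabSize, #words) do
    word2id[words[i]] = nextId
    nextId = nextId + 1
  end
  return word2id
end

-- Lookup helper: out-of-vocabulary words share the <unknown> id.
local function wordId(word2id, word)
  return word2id[word] or word2id["<unknown>"]
end
```

Capping the vocabulary this way keeps the output/softmax layer at a fixed size, which is where the memory and speed savings come from on larger datasets.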

Training with these changes is running on a small dataset. Will update with the results soon.

I have tried to keep the code changes similar to @chenb67's, so that we do not face too many conflicts while merging.

-- Update -- Since the dataset and vocabulary were small, some of the responses are not as good as before, but overall the results are satisfactory! The perplexity kept decreasing with each epoch and came down to around 2.76 after 50 epochs.
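For context on that number: assuming the reported perplexity is the usual exponential of the average per-token negative log-likelihood, the relationship is roughly this (the values below are made up, just to illustrate):

```lua
-- Illustrative only: perplexity as exp of the mean per-token NLL.
local totalNLL    = 1017.0  -- summed cross-entropy loss over an epoch (made-up value)
local totalTokens = 1000    -- number of target tokens in that epoch (made-up value)
print(math.exp(totalNLL / totalTokens))  -- prints roughly 2.76
```

So a perplexity of ~2.76 means the model is, on average, about as uncertain as a uniform choice over ~2.76 candidate tokens at each step.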

Results:

th train.lua --cuda --dataset 25000 --hiddenSize 1000 --maxVocabSize 10000

you> Hi!
neuralconvo> Yes?
you> Hello?
neuralconvo> Hey.
you> How are you?
neuralconvo> I need to know you, hildy.
you> What's your name?
neuralconvo> Smith.
you> How so?
neuralconvo> He was uptown when i it, but saw i don't have to
you> What is the purpose of life?
neuralconvo> My baby daughter.
you> Are you intelligent?
neuralconvo> No.
you> Are you a machine?
neuralconvo> Yeah.
you> Are you human?
neuralconvo> No sir.
you> What are you?
neuralconvo> It's the maintenance entrance.
you> Do you plan on taking over the world?
neuralconvo> Yes.
you> Hi !
neuralconvo> Yes?
you> How is your brother?
neuralconvo> Terrible.
you> Are you married?
neuralconvo> No sir.