Closed Vincent717 closed 8 years ago
Any word that has been trimmed from the vocabulary (using --maxVocabSize) is replaced with <unknown>. You probably chose a --maxVocabSize that is too big.
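To illustrate the mechanism described above (this is a sketch in Python, not neuralconvo's actual Lua/Torch code, and build_vocab/encode are hypothetical helper names): a vocabulary cap keeps only the most frequent words, and anything outside that set is encoded as <unknown>.

```python
from collections import Counter

def build_vocab(tokens, max_vocab_size):
    # Keep only the max_vocab_size most frequent words;
    # everything else will be mapped to <unknown> at encode time.
    counts = Counter(tokens)
    return {word for word, _ in counts.most_common(max_vocab_size)}

def encode(tokens, vocab):
    # Replace out-of-vocabulary words with the <unknown> token.
    return [w if w in vocab else "<unknown>" for w in tokens]

corpus = "the cat sat on the mat the cat ran".split()
vocab = build_vocab(corpus, max_vocab_size=2)  # keeps only "the" and "cat"
print(encode("the dog sat".split(), vocab))
```

With a cap that small, most of the input comes back as <unknown> — which is why a vocabulary that is too aggressively trimmed produces replies full of <unknown>.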
I didn't actually set --maxVocabSize when training, but the good news is I retried with a small model on 100 examples and it can reply again. Can I assume there was some problem with the vocab.t7 in my previous try? And thank you anyway!
Btw, one more quick question: what is the meaning of "Epoch"? Since the training seems different in each epoch, do they use different data?
Ah yes! You have to remove data/*.t7 if you change training params.
Epoch is one training iteration over the whole dataset.
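In other words, every epoch reuses the same dataset; typically only the order of examples is reshuffled between passes. A tiny Python sketch of that loop (run_epochs is a hypothetical helper, not part of neuralconvo):

```python
import random

def run_epochs(dataset, num_epochs):
    # One epoch = one full pass over the whole dataset.
    sizes = []
    for epoch in range(num_epochs):
        examples = dataset[:]      # same data every epoch...
        random.shuffle(examples)   # ...only the order changes
        for example in examples:
            pass                   # one training step per example (placeholder)
        sizes.append(len(examples))
    return sizes

# Every epoch sees all 100 examples, just in a different order.
print(run_epochs(list(range(100)), num_epochs=3))
```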
That may be the problem!
I see! Thanks
Hi all,
I have trained a model for some days with a datasize of 50000. Everything was fine until I tried to eval it, and I got:
you> Hi
neuralconvo> <unknown>.
you> How are you?
neuralconvo> <unknown>.
you> What is your name?
neuralconvo> <unknown>.
Every response it gives is <unknown>. This is weird and I cannot find the reason, because no error has occurred so far. So do you have any thoughts on this? Thanks, Vincent