Using the same PTB files as on the GitHub page. Training, testing, and sampling all run in CPU mode, but the result is still "Awful: Nnet rejected" after using the suggested settings in RESULTS.md.
Maybe a GPU-based system would improve the output.
Hi:
I've been experimenting with faster-rnnlm using Karpathy's text data from char-rnn and other public-domain texts, but I can't move beyond "Awful: Nnet rejected".
I've tried changing the hidden-layer type to gru (among others) and lowering the parameter values as per your previous reply, but the results are always the same: the program exits after three passes, and the resulting nnet file is, I assume, of poor quality, judging by the sampled output.
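For reference, my runs look roughly like the simple example from the README; the exact hidden size, NCE sample count, and learning rate below are just the kinds of values I've been lowering, not anything canonical:

```
./rnnlm -rnnlm model.nnet -train train.txt -valid valid.txt \
    -hidden 100 -hidden-type gru -nce 20 -alpha 0.01
```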
Is there a known training corpus that works well with faster-rnnlm? Any links or additional information would be appreciated.
Cheers