redhairedcelt opened this issue 3 years ago
Develop a Word Level Neural Language Model: https://machinelearningmastery.com/how-to-develop-a-word-level-neural-language-model-in-keras/
Standard two-layer LSTM with a fully connected hidden layer.
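As a rough sketch of that architecture in Keras (vocabulary size, sequence length, and layer widths below are placeholders, not values from the tutorial):

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Input, Embedding, LSTM, Dense

vocab_size = 1000   # assumed; set from the fitted tokenizer
seq_length = 50     # assumed; length of the input word sequences

model = Sequential([
    Input(shape=(seq_length,)),
    Embedding(vocab_size, 50),           # learned word embeddings
    LSTM(100, return_sequences=True),    # first LSTM layer feeds sequences onward
    LSTM(100),                           # second LSTM layer returns final state
    Dense(100, activation="relu"),       # fully connected hidden layer
    Dense(vocab_size, activation="softmax"),  # next-word probability distribution
])
model.compile(loss="categorical_crossentropy", optimizer="adam")
```

The model predicts a probability over the vocabulary for the next word given the preceding `seq_length` words.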
Bidirectional RNNs: https://www.analyticsvidhya.com/blog/2019/01/sequence-models-deeplearning/
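In Keras a bidirectional layer is just a wrapper around a recurrent layer; it runs the sequence forward and backward and concatenates the two hidden states. A minimal sketch (input shape and widths are illustrative assumptions):

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Input, Bidirectional, LSTM, Dense

model = Sequential([
    Input(shape=(50, 8)),        # 50 timesteps, 8 features per step (assumed)
    Bidirectional(LSTM(32)),     # forward + backward passes, outputs concatenated (64 units)
    Dense(1, activation="sigmoid"),
])
```

Because the backward pass needs the whole sequence up front, bidirectional RNNs suit offline tasks (tagging, classification) rather than left-to-right generation.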
Empirical Evaluation of Gated Recurrent Neural Networks on Sequence Modeling: https://arxiv.org/pdf/1412.3555v1.pdf
In their experiments, both GRU and LSTM outperform the traditional tanh RNN on sequence modeling tasks.
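The GRU the paper compares against the LSTM replaces the input/forget/output gates and separate cell state with two gates acting on a single hidden state. A NumPy sketch of one GRU step following those equations (dimensions and weight initialization below are illustrative, not from the paper):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x, h, p):
    """One GRU update: gates decide how much of h to overwrite."""
    z = sigmoid(p["Wz"] @ x + p["Uz"] @ h)               # update gate
    r = sigmoid(p["Wr"] @ x + p["Ur"] @ h)               # reset gate
    h_tilde = np.tanh(p["Wh"] @ x + p["Uh"] @ (r * h))   # candidate state
    return (1.0 - z) * h + z * h_tilde                   # interpolate old and new

# Illustrative sizes (assumed): 4 input features, 8 hidden units
n_in, n_hid = 4, 8
rng = np.random.default_rng(0)
p = {k: 0.1 * rng.standard_normal((n_hid, n_in if k[0] == "W" else n_hid))
     for k in ("Wz", "Uz", "Wr", "Ur", "Wh", "Uh")}

h = np.zeros(n_hid)
for x in rng.standard_normal((10, n_in)):  # run 10 timesteps
    h = gru_step(x, h, p)
```

The update gate `z` plays a role similar to the LSTM's forget/input gates but with fewer parameters, which is part of why the paper finds the two architectures comparably effective.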