bmigette opened this issue 6 years ago
LSTM is a bit tricky, and I need support from the CNTK team. If you know any developers who understand RNNs well, please ask them to contribute.
I know the theory behind RNNs and dropout, but I am not familiar with CNTK at all...
Maybe this could help: https://stackoverflow.com/questions/44924690/keras-the-difference-between-lstm-dropout-and-lstm-recurrent-dropout
https://pdfs.semanticscholar.org/3061/db5aab0b3f6070ea0f19f8e76470e44aefa5.pdf
http://www.aclweb.org/anthology/C16-1165
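The references above describe "variational" recurrent dropout, where one dropout mask is sampled per sequence and reused at every timestep of the recurrence, instead of resampling a fresh mask each step. A minimal pure-Python sketch of that difference (the function names here are just for illustration, not part of any framework):

```python
import random

def recurrent_dropout_masks(num_units, rate, timesteps, rng):
    # Variational (recurrent) dropout: sample ONE mask per sequence and
    # reuse it at every timestep, so a dropped unit stays dropped for
    # the whole recurrence. Inverted-dropout scaling (1/keep) keeps the
    # expected activation unchanged.
    keep = 1.0 - rate
    mask = [1.0 / keep if rng.random() < keep else 0.0 for _ in range(num_units)]
    return [mask] * timesteps  # the same mask object at every step

def standard_dropout_masks(num_units, rate, timesteps, rng):
    # Ordinary dropout: an independent mask is resampled at each timestep.
    keep = 1.0 - rate
    return [[1.0 / keep if rng.random() < keep else 0.0 for _ in range(num_units)]
            for _ in range(timesteps)]
```

In an LSTM step, the recurrent mask would be applied to the hidden state h(t-1) before the recurrent matrix multiply, which is what Keras's `recurrent_dropout` argument controls.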
If you have support from the CNTK team, it might be worth asking them whether an example is available.
I have asked for help. Will wait and hopefully can implement soon.
Good stuff!
Any update on this?
Hello,
It would be great to consider adding the recurrent_dropout and recurrent_regularizer parameters to the LSTM layer. See: https://keras.io/layers/recurrent/#lstm