SciSharp / SiaNet

An easy to use C# deep learning library with CUDA/OpenCL support
https://scisharp.github.io/SiaNet
MIT License

Add recurrent_dropout and recurrent_regularizer to LSTM Layer #29

Open bmigette opened 6 years ago

bmigette commented 6 years ago

Hello,

It would be great to consider adding recurrent_dropout and recurrent_regularizer parameters to the LSTM layer. See: https://keras.io/layers/recurrent/#lstm
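For context, here is a hedged pure-Python sketch of what the second parameter means in Keras; the names are hypothetical and this is not SiaNet or CNTK code. recurrent_regularizer adds a penalty term (typically L2) on the hidden-to-hidden weight matrix to the training loss:

```python
def l2_recurrent_penalty(recurrent_weights, l2=1e-4):
    """Hypothetical illustration of recurrent_regularizer: an extra loss
    term computed on the hidden-to-hidden (recurrent) weights only,
    here an L2 penalty l2 * sum(w^2)."""
    return l2 * sum(w * w for row in recurrent_weights for w in row)

# The trainer would add this to the task loss each step, e.g.:
#   total_loss = task_loss + l2_recurrent_penalty(W_hh, l2=1e-4)
```

recurrent_dropout is the trickier half, since it has to be applied inside the recurrence rather than on the layer input.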

deepakkumar1984 commented 6 years ago

LSTM is a bit tricky and I need support from the CNTK team. If you know any developers who understand RNNs well, please ask them to contribute.

bmigette commented 6 years ago

I know the theory behind RNNs and dropout, but I am not familiar with CNTK at all...

Maybe this could help: https://stackoverflow.com/questions/44924690/keras-the-difference-between-lstm-dropout-and-lstm-recurrent-dropout

https://pdfs.semanticscholar.org/3061/db5aab0b3f6070ea0f19f8e76470e44aefa5.pdf

http://www.aclweb.org/anthology/C16-1165
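The key idea in those references can be sketched in a few lines of plain Python (hypothetical toy code, not CNTK or SiaNet): recurrent_dropout masks the hidden state inside the recurrence, and the variational variant samples one mask per sequence and reuses it at every timestep instead of resampling each step.

```python
import random

def make_dropout_mask(size, rate, rng):
    # Inverted-dropout mask: each unit is zeroed with probability `rate`,
    # survivors are scaled by 1/(1-rate) so the expected activation is unchanged.
    keep = 1.0 - rate
    return [(1.0 / keep) if rng.random() < keep else 0.0 for _ in range(size)]

def rnn_step(h, x, mask):
    # Toy recurrent update: the *masked* hidden state feeds the recurrence.
    # That is what recurrent_dropout applies to (plain dropout masks x instead).
    h_dropped = [hv * m for hv, m in zip(h, mask)]
    return [0.5 * hd + 0.5 * xv for hd, xv in zip(h_dropped, x)]

def run_sequence(xs, hidden_size, recurrent_dropout, seed=0):
    rng = random.Random(seed)
    # Variational recurrent dropout: sample ONE mask per sequence and reuse
    # it at every timestep, so the same units stay dropped for the whole run.
    mask = make_dropout_mask(hidden_size, recurrent_dropout, rng)
    h = [0.0] * hidden_size
    for x in xs:
        h = rnn_step(h, x, mask)
    return h
```

With rate 0.0 the mask is all ones and the recurrence is unaffected; a real LSTM would apply the mask to the hidden state entering each of the four gates.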

If you have support from the CNTK team, it might be worth asking them whether an example is available.

deepakkumar1984 commented 6 years ago

I have asked for help. I will wait and hopefully can implement it soon.

bmigette commented 6 years ago

Good stuff!

bmigette commented 6 years ago

Any update on this?