Element-Research / rnn

Recurrent Neural Network library for Torch7's nn
BSD 3-Clause "New" or "Revised" License

Add SeqDropout #351

Closed · hfxunlp closed this 8 years ago

hfxunlp commented 8 years ago

I changed the Dropout implementation from torch/nn to get a dropout module that can be used with SeqLSTM, SeqGRU, and SeqBRNN. The whole sequence shares the same dropout mask, which is different from nn.Dropout.
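
A minimal sketch of the idea (an editor's paraphrase for illustration, not the exact code in this PR) assuming the seqLen x batchSize x inputSize layout that SeqLSTM and SeqGRU use: one Bernoulli mask is sampled per (batch, feature) entry and broadcast over every time step.

```lua
require 'nn'

local SeqDropout, parent = torch.class('nn.SeqDropout', 'nn.Module')

function SeqDropout:__init(p)
   parent.__init(self)
   self.p = p or 0.5            -- probability of dropping a unit
   self.train = true
   self.noise = torch.Tensor()  -- the mask shared across time steps
end

function SeqDropout:updateOutput(input)
   self.output:resizeAs(input):copy(input)
   if self.train and self.p > 0 then
      -- sample one mask for a single time step, scaled for inverted dropout
      self.noise:resize(1, input:size(2), input:size(3))
      self.noise:bernoulli(1 - self.p):div(1 - self.p)
      -- broadcast the same mask over the whole sequence
      self.output:cmul(self.noise:expandAs(input))
   end
   return self.output
end

function SeqDropout:updateGradInput(input, gradOutput)
   self.gradInput:resizeAs(gradOutput):copy(gradOutput)
   if self.train and self.p > 0 then
      self.gradInput:cmul(self.noise:expandAs(input))
   end
   return self.gradInput
end
```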

hfxunlp commented 8 years ago

I am not very good at either coding or English, so I would appreciate any kind of help.

JoostvDoorn commented 8 years ago

👍 I think this module does not need to be restricted to SeqLSTM, SeqGRU, etc. It would also be helpful if this module could support :remember('both').

For anyone interested in why you would want to do this, read: http://www.stat.berkeley.edu/~tsmoon/files/Conference/asru2015.pdf (edit: sorry, for some reason I posted the wrong link here earlier)
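
One hypothetical reading of remember support (building on the sketch above; not code from this PR): keep the sampled mask across successive forward calls so the dropout noise stays fixed while a Sequencer runs with remember('both'), and resample only on forget().

```lua
function SeqDropout:updateOutput(input)
   self.output:resizeAs(input):copy(input)
   if self.train and self.p > 0 then
      if self.noise:nElement() == 0 then
         -- no remembered mask yet: sample one (assumes the batch size
         -- stays constant until the next forget())
         self.noise:resize(1, input:size(2), input:size(3))
         self.noise:bernoulli(1 - self.p):div(1 - self.p)
      end
      self.output:cmul(self.noise:expandAs(input))
   end
   return self.output
end

function SeqDropout:forget()
   -- discard the remembered mask, mirroring forget() elsewhere in rnn
   self.noise:resize(0)
   return self
end
```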

hfxunlp commented 8 years ago

@JoostvDoorn Thank you for your support. This module was meant to be used in front of a Seq* RNN implementation, and I'm not sure how to implement remember in it. I just use it like this: nn.Sequential():add(nn.SeqDropout()):add(nn.SeqGRU(128,128)). In my understanding, remember is a method of AbstractSequencer, and supporting it might require some changes in SeqLSTM, SeqGRU, etc., but I'm sorry, I do not know how to make that work; the code is too hard for me.

JoostvDoorn commented 8 years ago

That's okay; someone else (or I) should be able to implement remember for this fairly easily. Regarding it being restricted to SeqGRU etc.: you can also use it like this: nn.Sequential():add(nn.SeqDropout(0.2)):add(nn.Sequencer(nn.GRU(128,128))), so we do not need to restrict it in the documentation.
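
Both variants side by side (a sketch assuming, as in the usage above, that nn.SeqDropout takes the dropout probability as its argument; both pipelines consume a seqLen x batchSize x inputSize tensor):

```lua
require 'rnn'

-- the fused module
local fast = nn.Sequential()
   :add(nn.SeqDropout(0.2))
   :add(nn.SeqGRU(128, 128))

-- the generic Sequencer-wrapped module
local generic = nn.Sequential()
   :add(nn.SeqDropout(0.2))
   :add(nn.Sequencer(nn.GRU(128, 128)))

local input = torch.randn(10, 32, 128)  -- 10 steps, batch of 32
print(fast:forward(input):size())       -- 10 x 32 x 128
print(generic:forward(input):size())    -- 10 x 32 x 128
```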

hfxunlp commented 8 years ago

@JoostvDoorn Thank you for your help. I do not have much experience in this area, so your advice is very helpful to me.