Element-Research / rnn

Recurrent Neural Network library for Torch7's nn
BSD 3-Clause "New" or "Revised" License

Reset flag in Dropout to prevent bug if the state is cleared #413

Closed jbboin closed 7 years ago

jbboin commented 7 years ago

Calling clearState() on a lazy Dropout layer doesn't reset the flag, which means the noise mask isn't regenerated the next time the layer is called with an input. This causes a crash, reproduced as follows:

require 'rnn'

-- lazy Dropout: nn.Dropout(p, v1, inplace, lazy)
d = nn.Dropout(0.5, false, false, true)
input = torch.randn(2, 3)
d:forward(input)   -- noise mask is generated lazily on first forward
d:clearState()     -- clears the mask but does not reset the flag
d:forward(input)   -- crashes: the mask is never regenerated

The simple fix is to reset the flag to true when clearState() is called.
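A minimal sketch of that fix, written as a monkey-patch for illustration. It assumes the lazy-noise state is tracked in a field named self.flag, as described above; the actual field name and the structure of the real patch in the library may differ.

```lua
require 'rnn'

local Dropout = nn.Dropout

-- Hypothetical override: after clearing buffers, mark the noise
-- mask as stale so the next forward() regenerates it.
function Dropout:clearState()
   self.flag = true  -- assumed name of the lazy-noise flag
   return nn.Module.clearState(self)
end
```

With this override, the reproduction script above runs to completion: the second forward() sees flag == true and rebuilds the noise mask instead of reusing the cleared one.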