torch / tutorials

A series of machine learning tutorials for Torch7

5_newmodules: gradient checking on the Dropout is showing large error #25

Closed varun-invent closed 9 years ago

varun-invent commented 9 years ago

I added the Dropout layer (the one built in these tutorials) to my network and wrote code to check the gradient using finite differences. Without the Dropout layer the difference is very small (on the order of 1e-6), but as soon as I add the Dropout layer the error jumps to about 0.9. I suspect there is an error in the Dropout implementation. I am a newbie in Torch and am learning from https://github.com/oxford-cs-ml-2015/practical4 .
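For reference, a minimal sketch of this kind of finite-difference check in Torch7 (`checkGradient` and the `eps` step size are illustrative, not from the practical):

```lua
require 'nn'

-- Compare the analytic gradInput against a central finite-difference
-- estimate of d(sum(output .* gradOutput)) / d(input).
local function checkGradient(module, input, eps)
  eps = eps or 1e-6
  local output = module:forward(input)
  local gradOutput = torch.randn(output:size())
  local analytic = module:backward(input, gradOutput):clone()

  local numeric = torch.Tensor():resizeAs(input)
  local flatIn, flatNum = input:view(-1), numeric:view(-1)
  for i = 1, flatIn:nElement() do
    local orig = flatIn[i]
    flatIn[i] = orig + eps
    local fPlus = torch.cmul(module:forward(input), gradOutput):sum()
    flatIn[i] = orig - eps
    local fMinus = torch.cmul(module:forward(input), gradOutput):sum()
    flatIn[i] = orig                       -- restore the input
    flatNum[i] = (fPlus - fMinus) / (2 * eps)
  end
  -- Maximum absolute difference; ~1e-6 for deterministic modules.
  return (analytic - numeric):abs():max()
end

print(checkGradient(nn.Tanh(), torch.randn(4)))
```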

soumith commented 9 years ago

I don't think there's an error in the implemented Dropout. The gradient tests in nn.Jacobian don't run in the order you might expect: they may call updateOutput several times, followed by several calls to updateGradInput, so the dropout mask sampled in a given forward pass is no longer the one in effect when the corresponding backward pass runs. You'll have to write your own gradient check for this, and I have fairly high confidence that it passes.
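One way to write such a check, sketched below: reseed the global RNG before every forward pass so the same mask is sampled each time. This assumes the module draws its mask from the global torch RNG (as the tutorial's Dropout does via bernoulli); `withFixedMask` is an illustrative helper, not part of nn, and the snippet reuses `checkGradient` from the sketch above.

```lua
-- Shadow updateOutput on this instance so the mask is resampled
-- from a fixed seed on every forward call.
local function withFixedMask(module, seed)
  local updateOutput = module.updateOutput
  module.updateOutput = function(self, input)
    torch.manualSeed(seed)               -- same dropout mask every time
    return updateOutput(self, input)
  end
  return module
end

-- nn.Dropout stands in here for the module built in 5_newmodules.
local dropout = withFixedMask(nn.Dropout(0.5), 1234)
print(checkGradient(dropout, torch.randn(4)))
```

With the mask frozen, the analytic and numeric gradients are compared through the same fixed linear map, so the error should fall back toward 1e-6.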