karpathy / convnetjs

Deep Learning in Javascript. Train Convolutional Neural Networks (or ordinary ones) in your browser.
MIT License

Scaling dropout layer by keep probability during test time #106

Closed Chris-Nicholls closed 5 years ago

Chris-Nicholls commented 5 years ago

From the dropout paper http://www.cs.toronto.edu/~rsalakhu/papers/srivastava14a.pdf :

> If a unit is retained with probability p during training, the outgoing weights of that unit are multiplied by p at test time

The test activations should be scaled by (1 - drop_prob), not by drop_prob. For example, if drop_prob is 0, this layer should have no effect, so the activations should be scaled by 1.
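For illustration, here is a minimal sketch of the intended behavior (hypothetical code, not convnetjs's actual implementation; `drop_prob` is assumed to be the probability of dropping a unit, so the keep probability is `1 - drop_prob`):

```js
// Hypothetical dropout forward pass (not convnetjs's actual code).
// drop_prob: probability of zeroing a unit during training.
function dropoutForward(activations, drop_prob, isTraining) {
  var out = new Array(activations.length);
  if (isTraining) {
    for (var i = 0; i < activations.length; i++) {
      // drop each unit independently with probability drop_prob
      out[i] = Math.random() < drop_prob ? 0 : activations[i];
    }
  } else {
    for (var i = 0; i < activations.length; i++) {
      // scale by the keep probability (1 - drop_prob), not drop_prob,
      // so that drop_prob = 0 leaves the activations unchanged
      out[i] = activations[i] * (1 - drop_prob);
    }
  }
  return out;
}
```

(Many modern implementations instead use "inverted dropout": scale by 1 / (1 - drop_prob) during training so no scaling is needed at test time.)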

ratajczak commented 5 years ago

Hi Chris, it looks like a duplicate of #61

Chris-Nicholls commented 5 years ago

Yup, you're right. Looks like this isn't being maintained anyway.