mdenil / dropout

A theano implementation of Hinton's dropout.
MIT License

Are all the weights multiplied by the inclusion probability p during testing? #15

Closed dzhang22 closed 8 years ago

dzhang22 commented 8 years ago

I couldn't find the code that rescales the model parameters by p during testing, as the original paper suggests. Is this correct, or am I missing something?

mdenil commented 8 years ago

Reweighting happens here: https://github.com/mdenil/dropout/blob/master/mlp.py#L130

The reweighted pathway through the theano graph leads to negative_log_likelihood and errors here: https://github.com/mdenil/dropout/blob/master/mlp.py#L160

This pathway is used to construct the test function here: https://github.com/mdenil/dropout/blob/master/mlp.py#L240
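For readers following along, the scheme the thread is describing can be sketched in plain NumPy (this is an illustrative sketch, not the repo's Theano code; the function name and shapes are made up for the example). During training, each input unit is kept with probability p; at test time no units are dropped, and instead the weights are scaled by p so the expected pre-activation matches what the network saw during training:

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout_layer(x, W, p_retain, train, rng=rng):
    """Hypothetical sketch of test-time weight rescaling for dropout.

    train=True:  drop each input unit with probability 1 - p_retain.
    train=False: use all units, but scale the weights by p_retain,
                 so E[output] matches the training-time pathway.
    """
    if train:
        mask = rng.random(x.shape) < p_retain  # keep unit with prob p_retain
        return (x * mask) @ W
    return x @ (W * p_retain)  # test-time "reweighting" pathway

x = np.ones((1, 4))
W = np.ones((4, 3))

# Test-time output is exactly p_retain * (x @ W): 4 inputs * 0.5 = 2.0 per unit.
print(dropout_layer(x, W, 0.5, train=False))  # → [[2. 2. 2.]]
```

In the repo itself this rescaling is what the `mlp.py#L130` link above points at: the test-time graph is built from weights multiplied by the retention probability, rather than by re-applying a mask.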