Reweighting happens here: https://github.com/mdenil/dropout/blob/master/mlp.py#L130
The reweighted pathway through the Theano graph leads to negative_log_likelihood and errors here: https://github.com/mdenil/dropout/blob/master/mlp.py#L160 .
This pathway is used to construct the test function here: https://github.com/mdenil/dropout/blob/master/mlp.py#L240
I didn't find any code that rescales (de-weights) the model parameters by p at test time, as the original paper suggests. Is this correct, or am I missing something?
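For clarity, here is a minimal sketch (plain numpy, function names are my own, not from this repo) of the test-time rescaling I was expecting from the paper: units are kept with probability p during training, and at test time no units are dropped but the weights are multiplied by p so the expected pre-activation matches training.

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout_layer_train(x, W, b, p_retain):
    """Training pass: keep each input unit with probability p_retain,
    zeroing the rest (no rescaling during training in the paper's scheme)."""
    mask = rng.binomial(n=1, p=p_retain, size=x.shape)
    return np.maximum(0.0, (x * mask) @ W + b)

def dropout_layer_test(x, W, b, p_retain):
    """Test pass: use all units, but scale the weights by p_retain so the
    expected pre-activation equals the training-time expectation."""
    return np.maximum(0.0, x @ (W * p_retain) + b)

# toy usage
x = rng.normal(size=(4, 10))        # batch of 4 examples, 10 input units
W = rng.normal(size=(10, 5)) * 0.1  # 10 -> 5 weight matrix
b = np.zeros(5)
print(dropout_layer_train(x, W, b, p_retain=0.5).shape)  # (4, 5)
print(dropout_layer_test(x, W, b, p_retain=0.5).shape)   # (4, 5)
```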