mdenil / dropout
A theano implementation of Hinton's dropout.
MIT License · 144 stars · 58 forks
Issues
#16 · Why set the W by this formula W=layer.W / (1 - dropout_rates[layer_counter]) in testing? · BayronP · closed 6 years ago · 1 comment
#15 · Are all the weights multiplied by the inclusion probability p during testing? · dzhang22 · closed 8 years ago · 1 comment
#14 · Dropout rate should be set to 0 if not using dropout · droid666 · opened 9 years ago · 3 comments
#13 · dropout training doesn't work with over 3 hidden layers · winstonquock · opened 9 years ago · 0 comments
#12 · Difference between 'dropout' and 'backprop' arguments in script · saatvikshah · closed 9 years ago · 5 comments
#11 · decrease total output, incl. the bias component · dgoldman-pdx · closed 9 years ago · 4 comments
#10 · About the Resample Issue · sephirothvs · closed 10 years ago · 1 comment
#9 · dropping output units rather than connections · lzamparo · closed 10 years ago · 1 comment
#8 · add several functionalities to mlp.py · ChenglongChen · closed 10 years ago · 3 comments
#7 · Momentum again · ChenglongChen · closed 10 years ago · 2 comments
#6 · Random dropout at each mini-batch? · ChenglongChen · closed 10 years ago · 8 comments
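Issue #6 asks whether the dropout mask is resampled for every mini-batch. In the standard scheme it is: each gradient step trains a freshly thinned sub-network. A hedged NumPy sketch (illustrative names, not the repo's Theano implementation):

```python
import numpy as np

def fresh_masks(batches, p_drop, rng):
    # Draw a new Bernoulli keep-mask for every mini-batch, so each
    # gradient step updates a different thinned sub-network.
    return [rng.random(x.shape) >= p_drop for x in batches]

rng = np.random.default_rng(42)
batches = [np.zeros((32, 100)) for _ in range(3)]
masks = fresh_masks(batches, 0.5, rng)

# Two independently drawn (32, 100) masks are essentially never identical.
print(np.array_equal(masks[0], masks[1]))  # False
```

In a Theano setting the same effect comes from drawing the mask from a random stream inside the compiled graph, so a new sample is taken on every call rather than fixed at compile time.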
#5 · Constrain weight matrix columns instead of rows · mdenil · closed 10 years ago · 1 comment
#4 · Momentum bug · mdenil · closed 11 years ago · 0 comments
#3 · Incorrect weight scaling on inputs · mdenil · closed 11 years ago · 0 comments
#2 · License · az0 · closed 12 years ago · 1 comment
#1 · no bias in mlp.py · rasmuspjohansson · closed 12 years ago · 2 comments