Open Phutoast opened 5 years ago
Dropout is used inside the GCN layer when `featureless=True`, and then `Dropout` is applied directly to the layer's output again — doesn't this result in double dropout?
Inside `train.py`:
```python
H = GraphConvolution(HIDDEN, support, num_bases=BASES, featureless=True,
                     activation='relu', W_regularizer=l2(L2))([X_in] + A_in)
H = Dropout(DO)(H)
```
Inside class `GraphConvolution`:
```python
if self.featureless:
    tmp = K.ones(self.num_nodes)
    tmp_do = Dropout(self.dropout)(tmp)
    output = (output.T * tmp_do).T
```
I am aware that the second dropout (inside `GraphConvolution`) acts per node, while the first one acts on individual features. Is there any intuition behind applying both?
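For what it's worth, the difference between the two can be sketched with NumPy: the per-node mask built from `tmp_do` zeroes entire rows of the hidden matrix (all features of a dropped node), while standard `Dropout` zeroes individual entries. The function names below are hypothetical, chosen just to illustrate the two masking patterns:

```python
import numpy as np

def feature_dropout(h, rate, rng):
    # Standard dropout: each entry of the hidden matrix is zeroed independently
    # and survivors are rescaled by 1 / (1 - rate), as in Dropout(DO)(H).
    mask = rng.random(h.shape) >= rate
    return h * mask / (1.0 - rate)

def node_dropout(h, rate, rng):
    # Featureless-style dropout: one mask entry per node, applied to whole
    # rows via (h.T * mask).T, mirroring (output.T * tmp_do).T in the layer.
    mask = rng.random(h.shape[0]) >= rate
    return (h.T * (mask / (1.0 - rate))).T

rng = np.random.default_rng(0)
h = np.ones((4, 3))  # 4 nodes, 3 hidden features

print(node_dropout(h, 0.5, rng))     # each row is entirely zero or entirely scaled
print(feature_dropout(h, 0.5, rng))  # zeros appear per entry, not per row
```

So applying both means a node can be removed entirely (per-node mask) and, if it survives, still have individual features masked (per-entry mask).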