tkipf / relational-gcn

Keras-based implementation of Relational Graph Convolutional Networks
MIT License

Double Dropout is used #8

Open Phutoast opened 5 years ago

Phutoast commented 5 years ago

When featureless=True, dropout is already applied inside the GraphConvolution layer, and then Dropout is applied again directly to its output. Doesn't this result in double dropout?

Inside train.py

# featureless layer: dropout is already applied internally (see below)
H = GraphConvolution(HIDDEN, support, num_bases=BASES, featureless=True,
                     activation='relu',
                     W_regularizer=l2(L2))([X_in] + A_in)
# a second Dropout is then applied directly to the layer's output
H = Dropout(DO)(H)

Inside class GraphConvolution

if self.featureless:
  # dropout on a vector of ones: one mask value per node
  tmp = K.ones(self.num_nodes)
  tmp_do = Dropout(self.dropout)(tmp)
  # scale each node's output row by its mask value (node-level dropout)
  output = (output.T * tmp_do).T

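For reference, a minimal NumPy sketch (not the repository's code; the 4x3 shape and the rate p are made-up values) of what the ones-vector trick amounts to at training time: dropping an entry of tmp_do zeroes the corresponding row of output, i.e. a whole node, and the surviving rows are rescaled by 1/(1 - p).

import numpy as np

rng = np.random.default_rng(0)
p = 0.5                                  # hypothetical dropout rate
output = rng.normal(size=(4, 3))         # 4 nodes x 3 hidden units

# inverted dropout on a vector of ones, one entry per node, roughly
# mimicking Dropout(self.dropout)(K.ones(self.num_nodes)) during training
tmp_do = (rng.random(4) > p).astype(float) / (1.0 - p)

# (output.T * tmp_do).T scales every hidden unit of node i by tmp_do[i]:
# dropped nodes become all-zero rows, kept nodes are rescaled
node_dropped = (output.T * tmp_do).T
print(node_dropped)
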
Phutoast commented 5 years ago

I am aware that the second dropout (the one inside the GraphConvolution layer) is applied per node, while the first one (the Dropout in train.py) is applied to the features. Is there any intuition behind this?
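
For illustration, a self-contained NumPy sketch (shapes and rate are made-up values, not taken from the repository) of how the two applications stack in a single forward pass: the in-layer mask removes whole nodes, and the Dropout(DO) mask then removes individual hidden units of whatever survived.

import numpy as np

rng = np.random.default_rng(1)
p = 0.5                                  # hypothetical rate for both dropouts
H = rng.normal(size=(4, 3))              # layer output: 4 nodes x 3 hidden units

# 1) node-level dropout from the featureless branch: one mask value per node
node_mask = (rng.random(4) > p).astype(float) / (1.0 - p)
H = (H.T * node_mask).T

# 2) feature-level dropout from H = Dropout(DO)(H): independent mask per entry
feat_mask = (rng.random(H.shape) > p).astype(float) / (1.0 - p)
H = H * feat_mask

# an entry survives only if both its node and its individual unit are kept,
# and surviving entries have been rescaled by 1/(1 - p) twice
print(H)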