williamleif / GraphSAGE

Representation learning on large graphs using stochastic graph convolutions.

Dense layers use relu with xavier initialization #195

Open Saydemr opened 1 year ago

Saydemr commented 1 year ago

It is known that He initialization works better with ReLU, since ReLU zeroes out roughly half of its inputs and Xavier initialization does not account for that variance loss. So, in layers.py:

# variance_scaling_initializer() defaults to factor=2.0 with fan-in
# scaling, i.e. He initialization.
self.vars['weights'] = tf.get_variable('weights', shape=(input_dim, output_dim),
                                       dtype=tf.float32,
                                       initializer=tf.contrib.layers.variance_scaling_initializer(),
                                       regularizer=tf.contrib.layers.l2_regularizer(FLAGS.weight_decay))
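
For reference, tf.contrib was removed in TensorFlow 2.x; a minimal sketch of the equivalent change there, assuming a Keras-style dense layer (the units value and the 5e-4 weight decay are illustrative stand-ins for output_dim and FLAGS.weight_decay):

import tensorflow as tf

# He initialization plus L2 weight decay in TF 2.x / Keras.
dense = tf.keras.layers.Dense(
    units=128,                                            # stand-in for output_dim
    activation='relu',
    kernel_initializer=tf.keras.initializers.HeNormal(),  # He init, as suggested above
    kernel_regularizer=tf.keras.regularizers.l2(5e-4),    # stand-in for FLAGS.weight_decay
)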

JFR (just for reference).