Hey, thank you for your excellent work and the shared code. Before the contextual attention module in inpainting_model.py, the gated convolution layer uses the ReLU activation instead of the ELU used in the other gen_conv layers. Is there a particular benefit to this choice?
x = gen_conv(x, 4*cnum, 3, 1, name='pmconv6', activation=tf.nn.relu)
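For context, here is a minimal sketch of how I understand gen_conv's gating to work, following the gated-convolution formulation from the DeepFill v2 paper. The GatedConv2D class below is my own illustration, not the repo's code; only the activation swap on the 'pmconv6' layer comes from the source.

import tensorflow as tf

class GatedConv2D(tf.keras.layers.Layer):
    """Gated conv sketch: output = activation(features) * sigmoid(gates)."""
    def __init__(self, cnum, ksize, stride, activation=tf.nn.elu, **kwargs):
        super().__init__(**kwargs)
        # A single conv produces both the feature map and the gating map.
        self.conv = tf.keras.layers.Conv2D(2 * cnum, ksize, stride, padding='same')
        self.activation = activation

    def call(self, x):
        x = self.conv(x)
        features, gates = tf.split(x, 2, axis=-1)
        return self.activation(features) * tf.sigmoid(gates)

# The default activation is ELU, but the layer feeding contextual attention
# ('pmconv6' above) passes activation=tf.nn.relu instead:
x = tf.random.normal([1, 64, 64, 32])
elu_out = GatedConv2D(cnum=128, ksize=3, stride=1)(x)  # ELU default
relu_out = GatedConv2D(cnum=128, ksize=3, stride=1, activation=tf.nn.relu)(x)  # before attention

My guess is the swap relates to the features being fed into the attention matching step, but I'd appreciate confirmation of the actual reasoning.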