JiahuiYu / generative_inpainting

DeepFill v1/v2 with Contextual Attention and Gated Convolution, CVPR 2018, and ICCV 2019 Oral
http://jiahuiyu.com/deepfill/

inpainting_model.py uses the ReLU activation function before contextual attention #474

Closed: huoxiangzuo closed this issue 3 years ago

huoxiangzuo commented 3 years ago

Hey, thank you for your excellent work and for sharing the code. In inpainting_model.py, the gated convolution layer just before contextual attention uses the ReLU activation function instead of ELU:

`x = gen_conv(x, 4*cnum, 3, 1, name='pmconv6', activation=tf.nn.relu)`

Is there any benefit to doing this?
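For context, here is a minimal sketch of the branch in question (TensorFlow 1.x, matching the repo; the gated-convolution internals are simplified here, and the `contextual_attention` call is elided):

```python
import tensorflow as tf  # TensorFlow 1.x, as used by the repo

def gen_conv(x, cnum, ksize, stride=1, name='conv', activation=tf.nn.elu):
    # Simplified gated convolution (DeepFill v2 style): half the output
    # channels are features, half form a sigmoid gate.
    # Note the ELU default activation.
    x = tf.layers.conv2d(x, cnum, ksize, stride, padding='SAME', name=name)
    feats, gate = tf.split(x, 2, axis=3)
    return activation(feats) * tf.nn.sigmoid(gate)

cnum = 48  # base channel width of the generator
x = tf.placeholder(tf.float32, [1, 64, 64, 4 * cnum])
# Earlier layers of the attention branch keep the ELU default...
x = gen_conv(x, 4 * cnum, 3, 1, name='pmconv5')
# ...but the layer feeding contextual attention switches to ReLU:
x = gen_conv(x, 4 * cnum, 3, 1, name='pmconv6', activation=tf.nn.relu)
# x, offset_flow = contextual_attention(x, x, mask, 3, 1, rate=2)
```

One guess: since contextual attention scores foreground/background patches with cosine similarity, ReLU keeps the incoming features non-negative, so the similarity scores stay non-negative before the softmax. But I am not sure whether that is the intended benefit.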

JiahuiYu commented 3 years ago

Ah, I forgot why... Maybe you can do some ablation studies on that.