znxlwm / UGATIT-pytorch

Official PyTorch implementation of U-GAT-IT: Unsupervised Generative Attentional Networks with Adaptive Layer-Instance Normalization for Image-to-Image Translation
MIT License

conv-adaILN-conv-adaILN: why is there no activation function layer in between? #36

Open FangYang970206 opened 5 years ago

FangYang970206 commented 5 years ago

I noticed that there is no activation function layer where consecutive blocks are connected, so the computation reads conv → adaILN → conv → adaILN. Why is it designed this way? Is it to achieve better performance?
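
To illustrate, here is a minimal runnable sketch of the block structure I am asking about. AdaILNStub is my stand-in for the repo's adaILN (which actually mixes instance and layer statistics with a learned rho); the pad/conv/norm layout is simplified from ResnetAdaILNBlock in networks.py:

```python
import torch
import torch.nn as nn

class AdaILNStub(nn.Module):
    """Stand-in for the repo's adaILN: here it only applies the
    style-conditioned affine (gamma, beta), to keep the sketch runnable."""
    def __init__(self, dim):
        super().__init__()
        self.dim = dim

    def forward(self, x, gamma, beta):
        # gamma, beta: (B, dim) -> broadcast over the spatial dimensions
        return x * gamma.unsqueeze(2).unsqueeze(3) + beta.unsqueeze(2).unsqueeze(3)

class ResnetAdaILNBlock(nn.Module):
    def __init__(self, dim, use_bias=False):
        super().__init__()
        self.pad1 = nn.ReflectionPad2d(1)
        self.conv1 = nn.Conv2d(dim, dim, kernel_size=3, bias=use_bias)
        self.norm1 = AdaILNStub(dim)
        self.relu1 = nn.ReLU(True)

        self.pad2 = nn.ReflectionPad2d(1)
        self.conv2 = nn.Conv2d(dim, dim, kernel_size=3, bias=use_bias)
        self.norm2 = AdaILNStub(dim)  # <-- no activation after this norm

    def forward(self, x, gamma, beta):
        out = self.relu1(self.norm1(self.conv1(self.pad1(x)), gamma, beta))
        out = self.norm2(self.conv2(self.pad2(out)), gamma, beta)
        # Residual add with no trailing ReLU, so stacking two blocks gives
        # ... conv-adaILN-(+x)-conv-adaILN-relu ... at the block boundary.
        return out + x

# Quick shape check
block = ResnetAdaILNBlock(dim=256)
x = torch.randn(1, 256, 64, 64)
gamma, beta = torch.ones(1, 256), torch.zeros(1, 256)
y = block(x, gamma, beta)
assert y.shape == x.shape
```

(For comparison, the ResnetBlock in CycleGAN's generator also drops the activation after the second norm and the residual add, so perhaps this just follows that convention?)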