shrubb / latent-pose-reenactment

The authors' implementation of the "Neural Head Reenactment with Latent Pose Descriptors" (CVPR 2020) paper.
https://shrubb.github.io/research/latent-pose-reenactment/
Apache License 2.0

Questions about architecture details. #5

Closed kangyeolk closed 3 years ago

kangyeolk commented 3 years ago

First, thank you for your awesome work! It is very helpful to me. I have two questions regarding the architectures.

  1. What is the effect of shifting the output range, as shown below, in the generator?

https://github.com/shrubb/latent-pose-reenactment/blob/59629a64105c7c33fa01c461a3c65d3690f8533c/generators/vector_pose_unsupervised_segmentation_noBottleneck.py#L172-L174

  2. There is no norm_layer in the resblocks of the discriminator; could you give me a reason for that? I think it's unusual.

Again, thanks!

shrubb commented 3 years ago

Thank you, Kangyeol!

  1. This is to stay a little safer from vanishing gradients. Consider a pixel where the ground-truth value is 1.0. Without such a range shift, the network is trained to emit a large positive output so that tanh(output) ≈ 1.0. The danger is that this output can become so large that the gradient of tanh at that point is nearly zero, hampering training. After shifting the range, gradients at extreme ground-truth values remain adequate while the outputs are still bounded (see the sketch after this list).

    I've been using this trick since my very first experiments. If you remove it, I'm sure everything will still work. After all, many people successfully train generative nets with plain tanh. But I just haven't checked.

    What's more, I think you could even use a plain linear activation (no sigmoid, tanh, or anything else) instead of this trick, and everything would still work.

  2. It worked in Zakharov et al., so we borrowed it as-is from there to save on experiments. Today we realize that your concern is indeed quite reasonable, and that batch norm or instance norm layers would likely improve the system.
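Regarding point 1, here is a minimal sketch of the kind of range shift described above. It assumes the trick amounts to scaling tanh so its range slightly exceeds the target range; the name `shifted_range_tanh` and the `margin` value are illustrative, and the actual constants in the linked lines may differ.

```python
import torch

def shifted_range_tanh(x, margin=0.1):
    # Scale tanh so its range is (-(1 + margin), 1 + margin) instead of (-1, 1).
    # A ground-truth pixel value of 1.0 then corresponds to
    # tanh(z) = 1 / (1 + margin) < 1, where the gradient of tanh is still
    # comfortably non-zero, rather than requiring tanh(z) -> 1 (saturation).
    # NOTE: margin is a hypothetical parameter, not taken from the repository.
    return (1.0 + margin) * torch.tanh(x)
```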
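Regarding point 2, the sketch below shows what a discriminator residual block with no normalization layers could look like, purely to illustrate the design being discussed; it is not the repository's actual module, and the layer layout is an assumption.

```python
import torch
from torch import nn

class NoNormResBlock(nn.Module):
    """Illustrative discriminator residual block without normalization layers."""

    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.act = nn.ReLU(inplace=True)
        # A norm layer (e.g. nn.InstanceNorm2d(channels)) could be inserted
        # before each activation to test the suggestion discussed above.

    def forward(self, x):
        residual = x
        out = self.conv2(self.act(self.conv1(self.act(x))))
        return out + residual
```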

kangyeolk commented 3 years ago

Thank you for your kind answer!