podgorskiy / ALAE

[CVPR2020] Adversarial Latent Autoencoders

MappingToLatent has no activation #44

eitanrich opened this issue 4 years ago

eitanrich commented 4 years ago

Is it intentional that the D module (MappingToLatent) consists of three F.Linear layers without any activations in between (e.g. no ReLU / LeakyReLU)?

https://github.com/podgorskiy/ALAE/blob/5d8362f3ce468ece4d59982ff531d1b8a19e792d/net.py#L894
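For context, a minimal sketch of the structure being asked about (assumptions: plain `nn.Linear` stands in for the repo's equalized-learning-rate Linear wrapper, and the layer count and widths are illustrative, not taken from the repo):

```python
import torch.nn as nn

class MappingToLatentSketch(nn.Module):
    """Simplified sketch of the D-side mapping as implemented:
    stacked linear layers with no nonlinearity between them."""
    def __init__(self, latent_size=512, mapping_layers=3):
        super().__init__()
        blocks = []
        for _ in range(mapping_layers):
            # No ReLU / LeakyReLU follows -- this is what the issue asks about.
            blocks.append(nn.Linear(latent_size, latent_size))
        self.map = nn.Sequential(*blocks)

    def forward(self, w):
        return self.map(w)
```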

podgorskiy commented 3 years ago

That's an error, but I won't change it, so as to stay consistent with the published results. Most likely the effect won't be significant, but I'm curious to see how the results would differ.
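For comparison, a sketch of what the presumably intended variant would look like, with a LeakyReLU after each linear layer as in StyleGAN-style mapping networks (the slope value and parameter names are assumptions, not taken from the repo):

```python
import torch.nn as nn

class MappingToLatentWithActivation(nn.Module):
    """Presumed intended variant: the same stack of linear layers,
    but with a LeakyReLU nonlinearity inserted after each one."""
    def __init__(self, latent_size=512, mapping_layers=3, slope=0.2):
        super().__init__()
        blocks = []
        for _ in range(mapping_layers):
            blocks.append(nn.Linear(latent_size, latent_size))
            blocks.append(nn.LeakyReLU(slope))  # the missing activation
        self.map = nn.Sequential(*blocks)

    def forward(self, w):
        return self.map(w)
```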

eitanrich commented 3 years ago

Maybe the behavior is related to the Implicit Rank-Minimizing Autoencoder :)
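To unpack the joke: without activations, the stacked linear layers collapse to a single linear map at inference time, yet training the factored parametrization with gradient descent implicitly biases the product matrix toward low rank, which is the effect the Implicit Rank-Minimizing Autoencoder paper (Jing et al., 2020) exploits deliberately. A tiny check of the collapse (shapes are illustrative):

```python
import torch

# Three stacked linear layers with no nonlinearity between them.
lin = torch.nn.Sequential(*[torch.nn.Linear(512, 512) for _ in range(3)])

# Fold the stack into a single weight matrix and bias:
# y = W2 (W1 (W0 x + b0) + b1) + b2
W = lin[2].weight @ lin[1].weight @ lin[0].weight
b = lin[2].weight @ (lin[1].weight @ lin[0].bias + lin[1].bias) + lin[2].bias

x = torch.randn(4, 512)
assert torch.allclose(lin(x), x @ W.T + b, atol=1e-4)
```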