jw9730 / setvae

[CVPR'21] SetVAE: Learning Hierarchical Composition for Generative Modeling of Set-Structured Data, in PyTorch
MIT License

Why is the residual ignored for the first layer? #4

Closed js0nwu closed 2 years ago

js0nwu commented 2 years ago

Hello, thank you for releasing the code for your SetVAE paper.

I had a question about this line: https://github.com/jw9730/setvae/blob/7fdf4c6633cc69015bcd688383679699debc24ab/models/networks.py#L205

For the first decoder block, why is the residual from the hidden state ignored?

Thanks

jw9730 commented 2 years ago

Hi ArkaneCow,

Please check Eq. (11) and Eq. (12) in the paper. There is no conditional dependency from z^(0) to z^(1), because z^(1) is designed to be the coarsest invariant latent variable, which dictates the modulation of the equivariant latent variable z^(0).
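To illustrate the idea for other readers, here is a minimal, hypothetical sketch (not the repository's actual code; the names `blocks`, `h_init`, and `decode` are illustrative assumptions) of a decoder loop in which only the first block skips the residual from the incoming hidden state, matching the dependency structure of Eq. (11) and (12):

```python
def decode(blocks, h_init):
    """Run decoder blocks, skipping the hidden-state residual for the first block.

    Hypothetical sketch: `blocks` is a list of callables mapping a hidden
    state to an update; the real SetVAE blocks are attention modules.
    """
    h = h_init
    outputs = []
    for i, block in enumerate(blocks):
        out = block(h)
        if i == 0:
            # First block: the residual from the hidden state is ignored,
            # since z^(1) has no conditional dependency on z^(0).
            h = out
        else:
            # Later blocks keep the usual residual connection.
            h = h + out
        outputs.append(h)
    return outputs
```

Under this reading, dropping the residual in the first block is not an oversight but the mechanism that makes z^(1) the top of the latent hierarchy.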

js0nwu commented 2 years ago

Thanks for the quick response. That makes sense. I also noticed that Figure 3b shows the same setup as the code.