vsitzmann / siren

Official implementation of "Implicit Neural Representations with Periodic Activation Functions"
MIT License

Question about Lemma 1.6 and the code implementation #10

Open fishfishson opened 4 years ago

fishfishson commented 4 years ago

Hi. In your paper, Lemma 1.6 concerns the distributions of X and Y = sin((pi/2)X). We know that the output of a linear layer is approximately normally distributed when the weights are initialized from uniform(-c, c). However, in your code the activation is just torch.sin(x), not torch.sin(pi/2 * x). Is there something I missed?
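One purely algebraic observation that may be relevant here (this is my own sketch, not an explanation from the authors): the factor pi/2 sits inside the argument of sin, so it can be absorbed into the weight matrix, i.e. sin((pi/2) * (W @ x)) == sin(((pi/2) * W) @ x). A minimal NumPy demonstration, with hypothetical layer sizes and the uniform(-c, c) initialization from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
fan_in, fan_out = 256, 128                 # hypothetical layer sizes
c = np.sqrt(6.0 / fan_in)                  # uniform(-c, c) init as in the paper
W = rng.uniform(-c, c, size=(fan_out, fan_in))
x = rng.uniform(-1.0, 1.0, size=fan_in)    # input in [-1, 1]

# Activation as written in Lemma 1.6: sin applied to (pi/2) times the pre-activation.
y_paper = np.sin((np.pi / 2) * (W @ x))

# Plain sin (as in the code), but with the pi/2 factor folded into the weights.
y_code = np.sin(((np.pi / 2) * W) @ x)

# The two are numerically identical.
assert np.allclose(y_paper, y_code)
```

So a plain torch.sin activation with rescaled weights produces exactly the same function as sin(pi/2 * x) with the original weights; whether that rescaling is the authors' intended reading of the lemma is exactly what this issue is asking.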

wtyuan96 commented 2 years ago

> Hi. In your paper, Lemma 1.6 concerns the distributions of X and Y = sin((pi/2)X). We know that the output of a linear layer is approximately normally distributed when the weights are initialized from uniform(-c, c). However, in your code the activation is just torch.sin(x), not torch.sin(pi/2 * x). Is there something I missed?

Same question here. Have you figured it out since?