PiotrDabkowski opened this issue 2 years ago

I have noticed that the alphas are not kept in a ModuleList (or nn.ParameterList) but in a plain Python list of parameters, which makes them untrainable. Example:

self.alpha1 = [nn.Parameter(torch.ones(1, channels, 1).to(rank)) for i in range(len(self.convs1))]

Is this expected?
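A minimal, self-contained sketch of the symptom (the Demo class and its sizes are made up for illustration, not the repository code): a Parameter kept in a plain Python list is never registered with the module, so it is invisible to .parameters() and to checkpoints.

```python
import torch
import torch.nn as nn

class Demo(nn.Module):
    """Illustrative only: mimics keeping the alphas in a plain Python list."""
    def __init__(self, channels=4, n_convs=2):
        super().__init__()
        # Plain list -> the Parameters are NOT registered with the module
        self.alpha1 = [nn.Parameter(torch.ones(1, channels, 1)) for _ in range(n_convs)]

m = Demo()
print(list(m.named_parameters()))   # [] -> nothing for the optimizer to update
print(list(m.state_dict().keys()))  # [] -> and nothing gets checkpointed
```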
Thank you...
I have mis-implemented this parameter...
I'll fix it right now.
Thanks again
I have changed the alphas to nn.ParameterList:

self.alpha1 = nn.ParameterList([nn.Parameter(torch.ones(1, channels, 1)) for i in range(len(self.convs1))])
https://github.com/sh-lee-prml/BigVGAN/blob/main/models_bigvgan.py#L51
https://github.com/sh-lee-prml/BigVGAN/blob/main/models_bigvgan.py#L52
https://github.com/sh-lee-prml/BigVGAN/blob/main/models_bigvgan.py#L100
https://github.com/sh-lee-prml/BigVGAN/blob/main/models_bigvgan.py#L102
https://github.com/sh-lee-prml/BigVGAN/blob/main/models_bigvgan.py#L108
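For comparison, a small sketch (illustrative names again, not the actual repository modules) showing that wrapping the same list in nn.ParameterList registers the alphas, so they appear in .parameters() and receive gradients:

```python
import torch
import torch.nn as nn

class DemoFixed(nn.Module):
    """Illustrative only: the same alphas, now registered via nn.ParameterList."""
    def __init__(self, channels=4, n_convs=2):
        super().__init__()
        self.alpha1 = nn.ParameterList(
            [nn.Parameter(torch.ones(1, channels, 1)) for _ in range(n_convs)]
        )

m = DemoFixed()
print([name for name, _ in m.named_parameters()])  # ['alpha1.0', 'alpha1.1']

# Gradients now reach the alphas:
x = torch.randn(3, 4, 10)
(x * m.alpha1[0]).sum().backward()
print(m.alpha1[0].grad is not None)  # True
```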
Now, alpha is trainable 😢
Thank you again👍
@sh-lee-prml Does alpha need to be greater than 0, since it acts as a frequency parameter? Maybe add a torch.exp(alpha) to make sure it stays positive, e.g. something like the sketch below.
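Just a rough sketch of what I mean (SnakeLogAlpha and log_alpha are made-up names, not from this repo): keep alpha in log scale so the effective value is always positive.

```python
import torch
import torch.nn as nn

class SnakeLogAlpha(nn.Module):
    """Sketch: snake activation with alpha stored in log scale, so exp(log_alpha) > 0."""
    def __init__(self, channels):
        super().__init__()
        # log_alpha = 0 -> effective alpha = exp(0) = 1 at initialization
        self.log_alpha = nn.Parameter(torch.zeros(1, channels, 1))

    def forward(self, x):
        alpha = torch.exp(self.log_alpha)  # strictly positive by construction
        return x + torch.sin(alpha * x) ** 2 / alpha
```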
Hi @HaiFengZeng
The official snake1d code initializes it to a positive value via abs():
a = torch.zeros_like(x[0]).normal_(mean=0,std=50).abs()
But I think it does not need to be greater than 0: a negative alpha just gives an inverted phase of the sine, and we use the squared value as below.
x = (x + (torch.sin(self.a * x) ** 2) / self.a)
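Putting it together, roughly how the activation looks with a trainable alpha and no positivity constraint; the class name, shapes, and init here are illustrative (following the abs()-of-a-normal scheme quoted above), not the exact repository code.

```python
import torch
import torch.nn as nn

class Snake1d(nn.Module):
    """Sketch of the snake activation: x + sin(a * x)^2 / a with a per-channel trainable a."""
    def __init__(self, channels):
        super().__init__()
        # abs() of a wide normal, as in the official init quoted above
        a = torch.zeros(1, channels, 1).normal_(mean=0.0, std=50.0).abs()
        self.a = nn.Parameter(a)  # trainable; may drift negative during training

    def forward(self, x):
        return x + torch.sin(self.a * x) ** 2 / self.a
```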