State-of-the-Art Deep Learning scripts organized by models - easy to train and deploy with reproducible accuracy and performance on enterprise-grade infrastructure.
[FastPitch] unused parameters in attention module #1310
Related to Model/Framework(s): SpeechSynthesis/FastPitch
Describe the bug
There is a convolutional layer, `attn_proj`, that is declared but never used anywhere, which leaves unused parameters in the model. I don't see any evidence of this layer in the related paper, so perhaps it should be removed?
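Unused parameters like this can be surfaced with a quick gradient check. The sketch below uses a hypothetical toy module (not the actual FastPitch attention code) with a layer named `attn_proj`, mirroring the reported situation: after one backward pass, any parameter whose `.grad` is still `None` never participated in the forward graph.

```python
import torch
import torch.nn as nn

# Hypothetical toy module mimicking the report: a conv layer named
# `attn_proj` is declared in __init__ but never called in forward(),
# so its parameters receive no gradients.
class ToyAttention(nn.Module):
    def __init__(self, dim=8):
        super().__init__()
        self.key_proj = nn.Linear(dim, dim)
        # Declared but never used in forward():
        self.attn_proj = nn.Conv1d(dim, dim, kernel_size=1)

    def forward(self, x):
        return self.key_proj(x)  # attn_proj is never invoked

model = ToyAttention()
model(torch.randn(4, 8)).sum().backward()

# Parameters with grad None after backward were not part of the graph.
unused = [name for name, p in model.named_parameters() if p.grad is None]
print(unused)  # ['attn_proj.weight', 'attn_proj.bias']
```

Such stray parameters also matter in practice: with `torch.nn.parallel.DistributedDataParallel`, unused parameters trigger errors unless `find_unused_parameters=True` is set, which adds overhead.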