as-ideas / TransformerTTS

🤖💬 Transformer TTS: Implementation of a non-autoregressive Transformer based neural network for text to speech.
https://as-ideas.github.io/TransformerTTS/

Why is dropout used in the decoder prenet also at inference? #75

Open · iclementine opened this issue 4 years ago

iclementine commented 4 years ago

I've noticed a detail in the decoder prenet: it also applies dropout at inference. There is a comment saying "# use dropout also in inference for positional encoding relevance". I've also tried disabling dropout at inference, but the generated audio is a mess. Is there a more detailed explanation for this?

Thank you!

https://github.com/as-ideas/TransformerTTS/blob/e4ded5bf5a488aab98ce6aee981e3ac0946f4ddc/model/layers.py#L397
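For readers without the repo open: the behaviour being asked about is a prenet whose dropout layers are invoked with `training=True` unconditionally, so they stay active at inference. Below is a minimal sketch of that pattern, not the repo's actual code; the class name `DecoderPrenet`, the layer sizes, and the 0.5 rate are illustrative assumptions (0.5 matches the Tacotron-style prenet convention).

```python
import tensorflow as tf

class DecoderPrenet(tf.keras.layers.Layer):
    """Hypothetical decoder prenet sketch: two dense layers with dropout
    that is applied in both training and inference."""

    def __init__(self, units=256, dropout_rate=0.5, **kwargs):
        super().__init__(**kwargs)
        self.dense_1 = tf.keras.layers.Dense(units, activation='relu')
        self.dense_2 = tf.keras.layers.Dense(units, activation='relu')
        self.dropout_1 = tf.keras.layers.Dropout(dropout_rate)
        self.dropout_2 = tf.keras.layers.Dropout(dropout_rate)

    def call(self, x, training=None):
        # training=True is passed unconditionally, so dropout masks are
        # sampled at inference time too (this is the detail the issue asks about).
        x = self.dropout_1(self.dense_1(x), training=True)
        x = self.dropout_2(self.dense_2(x), training=True)
        return x
```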

cfrancesco commented 3 years ago

Hi, this is taken from the Tacotron paper. I believe it helps with "highlighting" the position information for the autoregressive predictions.
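One way to see the practical effect, using the hypothetical `DecoderPrenet` sketch above (shapes here are just illustrative): with dropout left on, the prenet is stochastic at inference, so two forward passes over the same previous-frame input give different outputs. The decoder therefore cannot rely purely on the prenet features, which is consistent with the code comment about keeping the positional encoding relevant.

```python
prenet = DecoderPrenet()
x = tf.random.normal([1, 1, 80])  # one (batch, time, mel) frame; 80 mel bins assumed

y1 = prenet(x)
y2 = prenet(x)
# Non-zero difference: the prenet output varies between passes at inference.
print(tf.reduce_max(tf.abs(y1 - y2)).numpy())
```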