Closed — opened by Young-Sun, closed 6 years ago
Hi, `mlp_attention` is part of the multi-head attention. `dot_attention` and `mlp_attention` are two different methods of computing the attention weights within multi-head attention. The demo samples were trained using the default settings.
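For readers unfamiliar with the distinction, the two scoring methods can be sketched roughly as below. This is a minimal illustration of dot-product scoring versus MLP (additive) scoring, not the repository's actual implementation; the function names and shapes here are illustrative assumptions.

```python
import numpy as np

def softmax(scores):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(scores - scores.max())
    return e / e.sum()

def dot_attention(query, keys):
    """Scaled dot-product scoring: score_i = (q . k_i) / sqrt(d)."""
    scores = keys @ query / np.sqrt(query.shape[-1])  # (T,)
    return softmax(scores)

def mlp_attention(query, keys, W_q, W_k, v):
    """MLP (additive) scoring: score_i = v . tanh(W_q q + W_k k_i)."""
    hidden = np.tanh(keys @ W_k.T + query @ W_q.T)    # (T, hidden_dim)
    scores = hidden @ v                               # (T,)
    return softmax(scores)

# Toy example: one query attending over 6 key vectors of dimension 4.
rng = np.random.default_rng(0)
q = rng.normal(size=4)
K = rng.normal(size=(6, 4))
W_q = rng.normal(size=(8, 4))
W_k = rng.normal(size=(8, 4))
v = rng.normal(size=8)

w_dot = dot_attention(q, K)
w_mlp = mlp_attention(q, K, W_q, W_k, v)
```

Both variants produce a normalized weight vector over the keys; they differ only in how the raw scores are computed (a single dot product versus a small learned network).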
Sorry for the misunderstanding. Yes, both `dot_attention` and `mlp_attention` are part of multi-head attention. I see, so the audio samples were synthesized using `mlp_attention`. Thanks for your fast and kind reply. :-)
Hi again. The code is now set to use `mlp_attention`. Did the uploaded audio demo samples use `mlp_attention`? Have you ever experimented with multi-head attention? Do you have any audio samples?
Thanks.