thuhcsi / icassp2021-emotion-tts

Please visit: https://thuhcsi.github.io/icassp2021-emotion-tts/

What is `atten_weights_ph`? #3

Open ngocanh2162 opened 2 years ago

ngocanh2162 commented 2 years ago

In modules/attention.py, lines 434-435:

        if atten_weights_ph is not None:    # used for emotional gst tts inference
            atten_weights = atten_weights_ph

When I run inference, it gets stuck at this tensor. I cannot find any reference to it.

caixxiong commented 1 year ago

Thank you for your question.

During training, the attention weights over the GST tokens are computed from the prosody embedding of the reference utterance (i.e. the input utterance).
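A minimal sketch of that training-time computation (not the repo's exact code; `prosody_embedding`, `gst_tokens`, and the scaled dot-product form are illustrative assumptions):

    import torch
    import torch.nn.functional as F

    def gst_attention_weights(prosody_embedding: torch.Tensor,
                              gst_tokens: torch.Tensor) -> torch.Tensor:
        """prosody_embedding: [batch, d]; gst_tokens: [num_tokens, d].
        Returns attention weights over the tokens: [batch, num_tokens]."""
        scores = prosody_embedding @ gst_tokens.t()        # similarity to each token
        scores = scores / gst_tokens.size(-1) ** 0.5       # scaled dot-product
        return F.softmax(scores, dim=-1)                   # weights sum to 1 per utterance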

During synthesis, the attention weights are passed in through this argument, atten_weights_ph (i.e. attention weights placeholder), which is computed offline by averaging the attention weights of the top-K utterances of each emotion.
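Roughly, the offline preparation looks like the sketch below. The function name, the ranking score, and the inference call in the final comment are assumptions for illustration; only `atten_weights_ph` comes from the repo.

    import numpy as np

    def average_topk_weights(weights_per_utterance: np.ndarray,
                             scores: np.ndarray,
                             k: int = 10) -> np.ndarray:
        """weights_per_utterance: [num_utts, num_tokens] token attention weights
        extracted from the trained model for one emotion; scores: [num_utts]
        ranking score for selecting the top-K utterances. Returns [num_tokens]."""
        top_idx = np.argsort(scores)[::-1][:k]              # indices of top-K utterances
        return weights_per_utterance[top_idx].mean(axis=0)  # averaged token weights

    # At inference the averaged vector is fed in place of the reference-derived
    # weights, e.g. (hypothetical call): model.inference(text, atten_weights_ph=happy_weights)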