Rayhane-mamah / Tacotron-2

DeepMind's Tacotron-2 Tensorflow implementation
MIT License

How to freeze encoder and pretrain decoder? #244

Open cnlinxi opened 6 years ago

cnlinxi commented 6 years ago

Hi, I want to freeze the encoder and pretrain the decoder. I get an error when I add encoder_outputs_stop = tf.stop_gradient(encoder_outputs) in tacotron/models/tacotron.py: Cannot convert a partially known TensorShape to a Tensor: (?, ?). How can I fix it? Thanks a lot.
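For reference, here is a minimal TF 1.x sketch of the change described and of what typically raises this error (the placeholder and the zeros call are illustrative, not taken from the repo): that message usually means a partially known static shape was passed where a tensor is required.

```python
import tensorflow as tf

# Illustrative encoder output with a partially known shape: (batch, time, 512).
encoder_outputs = tf.placeholder(tf.float32, shape=(None, None, 512))

# Freezing the encoder: stop_gradient blocks backprop into the encoder graph.
encoder_outputs_stop = tf.stop_gradient(encoder_outputs)

# This raises "Cannot convert a partially known TensorShape to a Tensor",
# because .shape is the *static* shape and two dimensions are unknown:
#   tf.zeros(encoder_outputs_stop.shape)
# Using the dynamic shape tensor instead builds fine:
zeros = tf.zeros(tf.shape(encoder_outputs_stop))
```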

Clouxie commented 6 years ago

Hey, have you solved this problem? Can you share your solution with me?

cnlinxi commented 6 years ago

> Hey, have you solved this problem? Can you share your solution with me?

No. My teacher asked me to study something else. However, I'm still interested in this, so I look forward to your solution. Sorry for the late reply.

Clouxie commented 6 years ago

Yup, as Rayhane told me, you should add a new line at line 123 in tacotron.py:
encoder_outputs = tf.stop_gradient(encoder_outputs)
and also change the placeholders' None batch dimension to hparams.tacotron_batch_size. However, it doesn't work for me. I got: ValueError: Tried to convert 'input' to a tensor and failed. Error: None values not supported. Could you please re-open this task?
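A sketch of the placeholder change being described, assuming a feeder-style input placeholder (the name and shape here are illustrative); the ValueError above is the usual symptom of a Python None reaching an op input somewhere downstream:

```python
import tensorflow as tf
from hparams import hparams  # the repo's hyperparameter module

# Before: dynamic batch dimension.
# inputs = tf.placeholder(tf.int32, shape=(None, None), name='inputs')

# After: pin the batch dimension so downstream ops that need a fully
# known shape can be built.
inputs = tf.placeholder(
    tf.int32,
    shape=(hparams.tacotron_batch_size, None),
    name='inputs')
```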

cnlinxi commented 5 years ago

Hi @Clouxie, I have re-opened this task; my implementation is https://github.com/cnlinxi/tacotron2decoder. However, I get bad alignment when I fine-tune the whole model (25k steps). Sad... I'm working on fixing it. BTW, do you know why I can't achieve good alignment? I'm fine-tuning on a 0.65h dataset; is it too short?
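In case it helps, an alternative way to freeze the encoder is to exclude its variables from the optimizer rather than relying on tf.stop_gradient alone. A toy TF 1.x sketch (the scope names are assumptions, not the repo's actual scopes):

```python
import tensorflow as tf

# Toy variables standing in for the real encoder/decoder weights.
with tf.variable_scope('encoder'):
    enc_w = tf.get_variable('w', shape=[4, 4])
with tf.variable_scope('decoder'):
    dec_w = tf.get_variable('w', shape=[4, 4])

loss = tf.reduce_sum(tf.matmul(enc_w, dec_w))

# Only hand the optimizer the decoder variables; the encoder stays frozen
# even without tf.stop_gradient in the graph.
decoder_vars = [v for v in tf.trainable_variables()
                if not v.name.startswith('encoder')]
train_op = tf.train.AdamOptimizer(1e-3).minimize(loss, var_list=decoder_vars)
```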

Clouxie commented 5 years ago

Yep, you should watch this one: https://github.com/Rayhane-mamah/Tacotron-2/issues/244. In my opinion 0.65h is too short; I could only get alignment with 2h of data or more. The only good results I noticed were near 1-2k steps. Even so, I'm losing too much of the model's language knowledge: I can't synthesize longer audio, only short sentences. Please message me if you notice something interesting. I'll investigate this more in my free time.