TensorSpeech / TensorFlowTTS

😝 TensorFlowTTS: Real-Time State-of-the-art Speech Synthesis for TensorFlow 2 (supports English, French, Korean, Chinese, and German, and is easy to adapt to other languages)
https://tensorspeech.github.io/TensorFlowTTS/
Apache License 2.0

TensorFlowTTS_FastSpeech_with_TFLite.ipynb errors out #435

Closed: sayakpaul closed this issue 3 years ago

sayakpaul commented 3 years ago

Hi folks.

I am working to make the TensorFlowTTS_FastSpeech_with_TFLite.ipynb notebook fully runnable on Colab. After successfully converting the FastSpeech2 model to TensorFlow Lite, I am unable to run inference. Here's what I am getting:

---------------------------------------------------------------------------
RuntimeError                              Traceback (most recent call last)
<ipython-input-46-aa345b7d1ff7> in <module>()
      1 input_text = "Recent research at Harvard has shown meditating for as little as 8 weeks, can actually increase the grey matter in the parts of the brain responsible for emotional regulation, and learning."
      2 
----> 3 decoder_output_tflite, mel_output_tflite = infer(input_text)
      4 audio_before_tflite = melgan(decoder_output_tflite)[0, :, 0]
      5 audio_after_tflite = melgan(mel_output_tflite)[0, :, 0]

1 frames
<ipython-input-45-6f13c4de7810> in infer(input_text)
     43     interpreter.set_tensor(detail['index'], input_data[i])
     44 
---> 45   interpreter.invoke()
     46 
     47   # The function `get_tensor()` returns a copy of the tensor data.

/usr/local/lib/python3.6/dist-packages/tensorflow/lite/python/interpreter.py in invoke(self)
    538     """
    539     self._ensure_safe()
--> 540     self._interpreter.Invoke()
    541 
    542   def reset_all_variables(self):

RuntimeError: tensorflow/lite/kernels/reshape.cc:58 stretch_dim != -1 (0 != -1)Node number 83 (RESHAPE) failed to prepare.

I am using TensorFlow 2.4.0 (I have tested with nightly too; it doesn't help). Here's my Colab Notebook for reproducibility.
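For context, the failure happens inside the `infer` helper at `interpreter.invoke()`. Below is a minimal sketch of the kind of dynamic-shape TFLite inference the notebook attempts; the model path, the `prepare_input` helper, the input ordering, and the `infer_tflite` name are illustrative assumptions, not code from the notebook. The key detail is that the variable-length text input has to be resized before `allocate_tensors()`, otherwise RESHAPE nodes can fail to prepare.

import numpy as np
import tensorflow as tf

# Hypothetical path; in the notebook this is the converted FastSpeech2 model.
interpreter = tf.lite.Interpreter(model_path="fastspeech2.tflite")
input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

def prepare_input(input_ids):
  # FastSpeech2 also expects speaker id, speed, f0, and energy ratios; the exact
  # order depends on the exported signature, so check input_details on your model.
  return [
      np.array([input_ids], dtype=np.int32),
      np.array([0], dtype=np.int32),       # speaker id
      np.array([1.0], dtype=np.float32),   # speed ratio
      np.array([1.0], dtype=np.float32),   # f0 ratio
      np.array([1.0], dtype=np.float32),   # energy ratio
  ]

def infer_tflite(input_ids):
  # Resize the variable-length text input *before* allocating tensors.
  interpreter.resize_tensor_input(input_details[0]["index"], [1, len(input_ids)])
  interpreter.allocate_tensors()
  for detail, data in zip(input_details, prepare_input(input_ids)):
    interpreter.set_tensor(detail["index"], data)
  interpreter.invoke()
  # get_tensor() returns copies of the decoder and mel outputs.
  return (interpreter.get_tensor(output_details[0]["index"]),
          interpreter.get_tensor(output_details[1]["index"]))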

Tagging in @jaeyoo for visibility.

dathudeptrai commented 3 years ago

@sayakpaul tf 2.3.1 is fine :)))

dathudeptrai commented 3 years ago

@abattery @thaink can you guys help? many thanks :D

sayakpaul commented 3 years ago

@dathudeptrai I used TensorFlow 2.4.0 since it's also mentioned in @jaeyoo's initial notebook, and it's the current stable release.

abattery commented 3 years ago

Here's my Colab Notebook for reproducibility.

The above colab notebook works fine.

dathudeptrai commented 3 years ago

Here's my Colab Notebook for reproducibility.

The above colab notebook works fine.

I just explicitly forced tf-gpu==2.3.1, so the notebook works fine :D.
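For anyone following along, pinning that version in a Colab cell is just a pip install (a minimal sketch; restarting the Colab runtime after the install is usually needed so the new version is picked up):

# Pin the TensorFlow build the notebook was tested against.
!pip install tensorflow-gpu==2.3.1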

sayakpaul commented 3 years ago

I just tried TensorFlow 2.3.1 and now the inference results are messed up. Here's the updated Colab Notebook. Is there anything I am missing?

abattery commented 3 years ago

@dathudeptrai I tested it before tf-gpu==2.3.1 was added. The Colab notebook ran successfully with the tf-nightly version. Now the tf-gpu==2.3.1 requirement prevents using TensorFlow 2.4.0 or tf-nightly, since the TensorFlowTTS project explicitly depends on that specific version.
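One possible workaround, not taken from this thread: let the package install its pinned dependency, then reinstall the TensorFlow build you actually want to test over it before running the notebook.

# Hedged workaround: TensorFlowTTS pulls its pinned tensorflow-gpu==2.3.1,
# then the nightly build overrides it for the TFLite experiments.
!pip install TensorFlowTTS
!pip install -U tf-nightly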

sayakpaul commented 3 years ago

@abattery would it be possible for you to take my initial Colab Notebook and suggest any changes needed to run the TFLite model successfully?

abattery commented 3 years ago

I did not change anything. On my side, it just worked with tf-nightly.

sayakpaul commented 3 years ago

Let me try then.

sayakpaul commented 3 years ago

@abattery with tf-nightly (2.5.0-dev20201221) the inference results get messed up. Here's my Colab Notebook - https://colab.research.google.com/gist/sayakpaul/91013bd6c48af59db8b63cdc50822fe3/tensorflowtts-fastspeech-with-tflite.ipynb.

Can you suggest further?

dathudeptrai commented 3 years ago

@abattery with tf-nightly (2.5.0-dev20201221) the inference results get messed up. Here's my Colab Notebook - https://colab.research.google.com/gist/sayakpaul/91013bd6c48af59db8b63cdc50822fe3/tensorflowtts-fastspeech-with-tflite.ipynb.

Can you suggest further?

You forgot to load the weights for FastSpeech2. As for TFLite with tf-nightly, everything is OK; there is no bug on the TensorFlowTTS side or on the TFLite side.

# Build the model and load the pretrained weights before converting to TFLite.
fastspeech._build()
fastspeech.load_weights("/content/fastspeech2-generator-1500000.h5")
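For anyone hitting the same garbled-audio symptom: those two lines must run before the TFLite conversion step. Here is a hedged sketch of the conversion that follows them, where `inference_tflite` is an assumption based on the repo's TFLite notebooks (verify the exact function name against the notebook):

import tensorflow as tf

# Export the concrete function and convert; converting before load_weights would
# bake random weights into the .tflite file, which makes the synthesized audio sound wrong.
concrete_fn = fastspeech.inference_tflite.get_concrete_function()
converter = tf.lite.TFLiteConverter.from_concrete_functions([concrete_fn])
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.target_spec.supported_ops = [
    tf.lite.OpsSet.TFLITE_BUILTINS,
    tf.lite.OpsSet.SELECT_TF_OPS,
]
with open("fastspeech2.tflite", "wb") as f:
  f.write(converter.convert())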

sayakpaul commented 3 years ago

Really sorry about the pesky bug. Thanks for pointing it out. It all now runs end-to-end inside Colab (updated Colab Notebook).

@dathudeptrai would you like me to submit a PR including this Colab Notebook? It might be helpful for community members interested in running it entirely on Colab.

dathudeptrai commented 3 years ago

@sayakpaul the repo already includes all the Colab notebooks; it seems you did not realize that :D. (https://colab.research.google.com/drive/1akxtrLZHKuMiQup00tzO2olCaN-y3KiD?usp=sharing, https://colab.research.google.com/drive/1HudLLpT9CQdh2k04c06bHUwLubhGTWxA?usp=sharing, https://colab.research.google.com/drive/1YpSHRBRPBI7cnTkQn1UcVTWEQVbsUm1S?usp=sharing, https://colab.research.google.com/drive/1ybWwOS5tipgPFttNulp77P6DAB5MtiuN?usp=sharing)

sayakpaul commented 3 years ago

@dathudeptrai I am aware. I probably failed to convey my point. Let me retry.

The notebook TensorFlowTTS_FastSpeech_with_TFLite.ipynb (located inside the notebooks directory) does not run end-to-end on Colab; it will error out in its current form. Since I have already created a variant of the notebook that runs fully on Colab (without any errors), I was asking whether I should create a PR that includes my notebook.

Let me know if anything is unclear.

sayakpaul commented 3 years ago

@dathudeptrai a gentle ping in case you missed my previous comment.

dathudeptrai commented 3 years ago

@dathudeptrai a gentle ping in case you missed my previous comment.

Let's create a PR :D.

sayakpaul commented 3 years ago

Thanks for the go-ahead, I will proceed @dathudeptrai :)