TensorSpeech / TensorFlowTTS

TensorFlowTTS: Real-Time State-of-the-art Speech Synthesis for TensorFlow 2 (supports English, French, Korean, Chinese, and German; easy to adapt to other languages)
https://tensorspeech.github.io/TensorFlowTTS/
Apache License 2.0

Inference error in Colab on GPU runtime #763

Closed: MyTrueSelf closed this issue 2 years ago

MyTrueSelf commented 2 years ago

Hello,

I am trying to figure out what is going wrong.

In the Colab environment, your example worked well when the runtime type was set to 'None'.

But when the runtime type is set to 'GPU', I run into the error below.
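For reference, the cell that fails is essentially the loading and synthesis part of your English example. I am sketching it from the README here, so the exact pretrained model names and text may differ slightly from the notebook:

```python
import soundfile as sf
import tensorflow as tf

from tensorflow_tts.inference import AutoProcessor, TFAutoModel

# Load the pretrained text-to-mel model and vocoder
# (names as published for the English LJSpeech models).
processor = AutoProcessor.from_pretrained("tensorspeech/tts-tacotron2-ljspeech-en")
tacotron2 = TFAutoModel.from_pretrained("tensorspeech/tts-tacotron2-ljspeech-en")
mb_melgan = TFAutoModel.from_pretrained("tensorspeech/tts-mb_melgan-ljspeech-en")

text = "Hello, this is a test of TensorFlowTTS on a Colab GPU runtime."
input_ids = processor.text_to_sequence(text)

# Tacotron2 inference: text -> mel spectrogram.
decoder_output, mel_outputs, stop_token_prediction, alignment_history = tacotron2.inference(
    input_ids=tf.expand_dims(tf.convert_to_tensor(input_ids, dtype=tf.int32), 0),
    input_lengths=tf.convert_to_tensor([len(input_ids)], tf.int32),
    speaker_ids=tf.convert_to_tensor([0], dtype=tf.int32),
)

# MB-MelGAN inference: mel spectrogram -> waveform, then save to disk.
audio = mb_melgan.inference(mel_outputs)[0, :, 0]
sf.write("audio.wav", audio.numpy(), 22050, "PCM_16")
```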

I tried changing the CUDA version and installing cuDNN through conda, but that did not help.
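I also checked whether TensorFlow can actually see the GPU inside the notebook. These are standard TF 2 calls; the memory-growth setting is only something I tried as a possible workaround, not part of your example:

```python
import tensorflow as tf

# On a Colab GPU runtime this should list exactly one GPU device.
gpus = tf.config.list_physical_devices("GPU")
print("Visible GPUs:", gpus)

# Allocate GPU memory on demand instead of grabbing it all up front;
# this sometimes avoids cuDNN initialization failures on Colab.
for gpu in gpus:
    tf.config.experimental.set_memory_growth(gpu, True)
```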

(At the same time, the AutoConfig function from tensorflow_tts.inference worked fine.)

How can I solve this? Please help me.

[Screenshot of the error attached]

P.S. Once this is solved, I would like to try training on a Colab GPU. How can I train and fine-tune a model with your code?

stale[bot] commented 2 years ago

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs.