neonbjb / tortoise-tts

A multi-voice TTS system trained with an emphasis on quality
Apache License 2.0

raise RuntimeError('Error(s) in loading state_dict #480

Open SDCalvo opened 1 year ago

SDCalvo commented 1 year ago

For some context, I tried following the installation guide; I'm using WSL and a virtualenv to run the repo. After installing everything, I tried to run `python3 tortoise/do_tts.py --text "I'm going to speak this" --voice random --preset fast`

I get this error:

```
Traceback (most recent call last):
  File "path/tortoise-tts/tortoise-tts/tortoise/do_tts.py", line 27, in <module>
    tts = TextToSpeech(models_dir=args.model_dir)
  File "path/tortoise-tts/tortoise-tts/tortoise/api.py", line 231, in __init__
    self.autoregressive.load_state_dict(torch.load(get_model_path('autoregressive.pth', models_dir)))
  File "path/tortoise-tts/tortoise-tts/tts-env/lib/python3.10/site-packages/torch-2.0.1-py3.10-linux-x86_64.egg/torch/nn/modules/module.py", line 2041, in load_state_dict
    raise RuntimeError('Error(s) in loading state_dict for {}:\n\t{}'.format(
RuntimeError: Error(s) in loading state_dict for UnifiedVoice:
	Unexpected key(s) in state_dict: "gpt.h.0.attn.bias", "gpt.h.0.attn.masked_bias",
	"gpt.h.1.attn.bias", "gpt.h.1.attn.masked_bias", "gpt.h.2.attn.bias", "gpt.h.2.attn.masked_bias",
	"gpt.h.3.attn.bias", "gpt.h.3.attn.masked_bias", "gpt.h.4.attn.bias", "gpt.h.4.attn.masked_bias",
	"gpt.h.5.attn.bias", "gpt.h.5.attn.masked_bias", "gpt.h.6.attn.bias", "gpt.h.6.attn.masked_bias",
	"gpt.h.7.attn.bias", "gpt.h.7.attn.masked_bias", "gpt.h.8.attn.bias", "gpt.h.8.attn.masked_bias",
	"gpt.h.9.attn.bias", "gpt.h.9.attn.masked_bias", "gpt.h.10.attn.bias", "gpt.h.10.attn.masked_bias",
	"gpt.h.11.attn.bias", "gpt.h.11.attn.masked_bias", "gpt.h.12.attn.bias", "gpt.h.12.attn.masked_bias",
	"gpt.h.13.attn.bias", "gpt.h.13.attn.masked_bias", "gpt.h.14.attn.bias", "gpt.h.14.attn.masked_bias",
	"gpt.h.15.attn.bias", "gpt.h.15.attn.masked_bias", "gpt.h.16.attn.bias", "gpt.h.16.attn.masked_bias",
	"gpt.h.17.attn.bias", "gpt.h.17.attn.masked_bias", "gpt.h.18.attn.bias", "gpt.h.18.attn.masked_bias",
	"gpt.h.19.attn.bias", "gpt.h.19.attn.masked_bias", "gpt.h.20.attn.bias", "gpt.h.20.attn.masked_bias",
	"gpt.h.21.attn.bias", "gpt.h.21.attn.masked_bias", "gpt.h.22.attn.bias", "gpt.h.22.attn.masked_bias",
	"gpt.h.23.attn.bias", "gpt.h.23.attn.masked_bias", "gpt.h.24.attn.bias", "gpt.h.24.attn.masked_bias",
	"gpt.h.25.attn.bias", "gpt.h.25.attn.masked_bias", "gpt.h.26.attn.bias", "gpt.h.26.attn.masked_bias",
	"gpt.h.27.attn.bias", "gpt.h.27.attn.masked_bias", "gpt.h.28.attn.bias", "gpt.h.28.attn.masked_bias",
	"gpt.h.29.attn.bias", "gpt.h.29.attn.masked_bias".
```
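For reference, the unexpected keys are the GPT-2 attention-mask buffers (`attn.bias`, `attn.masked_bias`) that older `transformers` builds stored in checkpoints and newer releases no longer register, so a strict load rejects them. Besides pinning the library version (see the fix suggested below), a minimal, untested sketch of a hypothetical patch to the `load_state_dict` call in `api.py` shown in the traceback would be to filter those keys out before loading:

```python
# Hypothetical workaround sketch (not a confirmed fix): drop the stale GPT-2
# attention-mask buffers from the checkpoint, then load the remaining weights.
# get_model_path, models_dir and self.autoregressive are the names already
# used at the call site in api.py per the traceback above.
state_dict = torch.load(get_model_path('autoregressive.pth', models_dir))
state_dict = {
    k: v for k, v in state_dict.items()
    if not (k.endswith('.attn.bias') or k.endswith('.attn.masked_bias'))
}
self.autoregressive.load_state_dict(state_dict)
```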

TheChickenMan69 commented 1 year ago

I'm having the same problem; let me know if you find a solution and I'll do the same, haha!

santiagocalvoazumo commented 1 year ago

Use transformers==4.19 (another issue had that suggestion); it works perfectly now! Also, if you are running this on WSL you need at least 12 GB of RAM allocated to the virtual machine for this to work; with 8 GB it kills the process.
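For anyone else doing this inside a WSL virtualenv, the pin is just the usual pip invocation (4.19.0 is assumed here as the concrete point release of the 4.19 line), followed by re-running the original command:

```
pip install transformers==4.19.0
python3 tortoise/do_tts.py --text "I'm going to speak this" --voice random --preset fast
```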

Dash3210 commented 1 year ago

> Use transformers==4.19

I have 4.19 but it still gives me the same error.

Afiyetolsun commented 1 year ago

Same!