idiap / coqui-ai-TTS

🐸💬 - a deep learning toolkit for Text-to-Speech, battle-tested in research and production
https://coqui-tts.readthedocs.io
Mozilla Public License 2.0

Fix missing param on transformers 4.4 #59

Closed · gravityrail closed this 2 months ago

gravityrail commented 2 months ago

This was necessary for me on macOS; not sure if others need it.

pseudotensor commented 2 months ago

I still get this error on the head of your dev branch:

Traceback (most recent call last):
  File "/home/jon/h2ogpt/src/tts_coqui.py", line 114, in get_voice_streaming
    for i, chunk in enumerate(chunks):
  File "/home/jon/miniconda3/envs/h2ogpt/lib/python3.10/site-packages/torch/utils/_contextlib.py", line 35, in generator_context
    response = gen.send(None)
  File "/home/jon/miniconda3/envs/h2ogpt/lib/python3.10/site-packages/TTS/tts/models/xtts.py", line 658, in inference_stream
    gpt_generator = self.gpt.get_generator(
  File "/home/jon/miniconda3/envs/h2ogpt/lib/python3.10/site-packages/TTS/tts/layers/xtts/gpt.py", line 602, in get_generator
    return self.gpt_inference.generate_stream(
  File "/home/jon/miniconda3/envs/h2ogpt/lib/python3.10/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File "/home/jon/miniconda3/envs/h2ogpt/lib/python3.10/site-packages/TTS/tts/layers/xtts/stream_generator.py", line 179, in generate
    model_kwargs["attention_mask"] = self._prepare_attention_mask_for_generation(
  File "/home/jon/miniconda3/envs/h2ogpt/lib/python3.10/site-packages/transformers/generation/utils.py", line 498, in _prepare_attention_mask_for_generation
    torch.isin(elements=inputs, test_elements=pad_token_id).any()
TypeError: isin() received an invalid combination of arguments - got (test_elements=int, elements=Tensor, ), but expected one of:
 * (Tensor elements, Tensor test_elements, *, bool assume_unique, bool invert, Tensor out)
 * (Number element, Tensor test_elements, *, bool assume_unique, bool invert, Tensor out)
 * (Tensor elements, Number test_element, *, bool assume_unique, bool invert, Tensor out)
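The failure above is an overload mismatch: `torch.isin` only accepts a plain number positionally (as `test_element`, singular), so passing an integer `pad_token_id` via the `test_elements=` keyword matches no overload. A minimal sketch of the mismatch and the usual workaround, wrapping the scalar in a tensor so the Tensor/Tensor overload applies (the values here are illustrative, not taken from the traceback):

```python
import torch

inputs = torch.tensor([[5, 5, 1, 2]])
pad_token_id = 5  # an int, as transformers passes it

# This is the failing call shape from the traceback: the keyword
# `test_elements` only accepts a Tensor, and an int only matches the
# positional `test_element` overload, so it raises TypeError:
#   torch.isin(elements=inputs, test_elements=pad_token_id)

# Workaround: promote the scalar to a tensor on the same device.
mask = torch.isin(
    elements=inputs,
    test_elements=torch.tensor(pad_token_id, device=inputs.device),
)
print(mask)  # tensor([[ True,  True, False, False]])
```

This is the same shape of fix the linked patches apply on the transformers side: ensure `pad_token_id` reaches `torch.isin` as a tensor rather than a bare int.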

I had a patch that worked for transformers 4.42.3:

https://github.com/h2oai/h2ogpt/blob/52923ac21a1532983c72b45a8e0785f6689dc770/docs/xtt.patch

But 4.43.1+ broke it further.

pseudotensor commented 2 months ago

I had to patch transformers itself a bit to work around my issues: https://github.com/h2oai/h2ogpt/pull/1771

It may affect caching behavior and slow things down; I'm unsure.