edwko / OuteTTS

Interface for OuteTTS models.
Apache License 2.0

Error While running Inference #8

Closed · RakshitAralimatti closed 3 weeks ago

RakshitAralimatti commented 3 weeks ago

I'm getting the following error while running inference with both the raw and GGUF models:

---------------------------------------------------------------------------
Exception                                 Traceback (most recent call last)
<ipython-input-5-cc71416b4720> in <cell line: 7>()
      5 
      6 # Or initialize the interface with a GGUF model
----> 7 interface = InterfaceGGUF("/content/OuteTTS-0.1-350M-Q4_K_M.gguf")
      8 
      9 # Generate TTS output

... 5 frames omitted ...
/usr/local/lib/python3.10/dist-packages/transformers/tokenization_utils_fast.py in __init__(self, *args, **kwargs)
    113         elif fast_tokenizer_file is not None and not from_slow:
    114             # We have a serialization from tokenizers which let us directly build the backend
--> 115             fast_tokenizer = TokenizerFast.from_file(fast_tokenizer_file)
    116         elif slow_tokenizer is not None:
    117             # We need to convert a slow tokenizer to build the backend

Exception: data did not match any variant of untagged enum ModelWrapper at line 352271 column 3
edwko commented 3 weeks ago

This issue occurs when using an older version of the transformers library. To resolve it, please update to the latest version by running:

pip install transformers --upgrade

I've also updated the requirements.txt file to specify transformers>=4.46.1 and published an updated package on PyPI to ensure compatibility.
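After upgrading, you can sanity-check that the installed version meets the `transformers>=4.46.1` floor before re-running inference. Here's a minimal sketch; the version-parsing helpers are illustrative and not part of OuteTTS or transformers:

```python
def version_tuple(v: str) -> tuple:
    # Convert a version string like "4.46.1" into (4, 46, 1) for comparison.
    # Pre-release suffixes (e.g. "dev0") are ignored in this simple sketch.
    return tuple(int(p) for p in v.split(".")[:3] if p.isdigit())

def meets_minimum(installed: str, required: str = "4.46.1") -> bool:
    # Tuple comparison handles multi-digit components correctly
    # (e.g. 4.46 > 4.9, which naive string comparison would get wrong).
    return version_tuple(installed) >= version_tuple(required)

# To check your actual environment, uncomment:
# import transformers
# assert meets_minimum(transformers.__version__), (
#     "transformers is too old; run: pip install transformers --upgrade"
# )
```

If the assertion fails, the `ModelWrapper` tokenizer deserialization error above is expected, since older releases can't read the newer fast-tokenizer serialization format.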