edwko / OuteTTS

Interface for OuteTTS models.

GGUF usage #15

Open silvacarl2 opened 2 weeks ago

silvacarl2 commented 2 weeks ago

With GGUF, we are getting this error:

Using fallback chat format: llama-2
Exception ignored in: <function Llama.__del__ at 0x7f68ee54b7f0>
Traceback (most recent call last):
  File "/home/silvacarl/.local/lib/python3.10/site-packages/llama_cpp/llama.py", line 2201, in __del__
  File "/home/silvacarl/.local/lib/python3.10/site-packages/llama_cpp/llama.py", line 2198, in close
  File "/usr/lib/python3.10/contextlib.py", line 584, in close
  File "/usr/lib/python3.10/contextlib.py", line 576, in __exit__
  File "/usr/lib/python3.10/contextlib.py", line 561, in __exit__
  File "/usr/lib/python3.10/contextlib.py", line 340, in __exit__
  File "/home/silvacarl/.local/lib/python3.10/site-packages/llama_cpp/_internals.py", line 69, in close
  File "/usr/lib/python3.10/contextlib.py", line 584, in close
  File "/usr/lib/python3.10/contextlib.py", line 576, in __exit__
  File "/usr/lib/python3.10/contextlib.py", line 561, in __exit__
  File "/usr/lib/python3.10/contextlib.py", line 449, in _exit_wrapper
  File "/home/silvacarl/.local/lib/python3.10/site-packages/llama_cpp/_internals.py", line 63, in free_model
TypeError: 'NoneType' object is not callable

edwko commented 5 days ago

This issue seems to be related to llama-cpp-python. You might find more details here: GitHub Issue #806. Additionally, you could try this suggested solution: Issue #1610 - Comment.
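For reference, a minimal sketch (not taken verbatim from either issue) of the usual workaround: close the Llama instance explicitly before the interpreter shuts down, so cleanup does not happen inside __del__ during module teardown, when llama_cpp globals such as free_model may already have been cleared. The model path and prompt below are placeholders.

```python
from llama_cpp import Llama

# Placeholder path; point this at the GGUF file you are actually loading.
model = Llama(model_path="path/to/model.gguf")

try:
    out = model("Hello", max_tokens=16)
    print(out["choices"][0]["text"])
finally:
    # Release the native model explicitly instead of relying on __del__
    # at interpreter exit, which is where the traceback above is raised.
    model.close()
```

The same idea applies when the Llama object is created indirectly by another library: keep a reference to it and close (or del) it before the script exits.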

silvacarl2 commented 5 days ago

Thx, checking that out.