acon96 / home-llm

A Home Assistant integration & Model to control your smart home using a Local LLM

Fix bug in Chat Completions API #114

Closed by xBelladonna 2 months ago

xBelladonna commented 2 months ago

Recent versions of LocalAI return the object type `chat.completion.chunk` as well as `chat.completion`. The component only checks for `chat.completion`, so chunk-labelled responses fall through to the text-completion path. This results in the following exception:

   File "/config/custom_components/llama_conversation/agent.py", line 262, in async_process                                                                                                                                
     response = await self._async_generate(conversation)                                                                                                                                                                   
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^                                                                                                                                                                   
   File "/config/custom_components/llama_conversation/agent.py", line 187, in _async_generate
     return await self.hass.async_add_executor_job(
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
   File "/usr/local/lib/python3.12/concurrent/futures/thread.py", line 58, in run
     result = self.fn(*self.args, **self.kwargs)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
   File "/config/custom_components/llama_conversation/agent.py", line 907, in _generate
     return self._extract_response(result.json())
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
   File "/config/custom_components/llama_conversation/agent.py", line 862, in _extract_response
     return choices[0]["text"]
            ~~~~~~~~~~^^^^^^^^
 KeyError: 'text'
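
For context, chat-style responses keep the generated text under `message` (or `delta` when streaming) rather than under a top-level `text` key, so a response labelled `chat.completion.chunk` that reaches the text-completion path raises the `KeyError` above. The payloads below are illustrative only, following the OpenAI-compatible schema that LocalAI mirrors; the exact fields LocalAI emits may differ:

    # Chat-style response: the object may be "chat.completion" or, in recent
    # LocalAI versions, "chat.completion.chunk"; the text sits under message.content.
    chat_response = {
        "object": "chat.completion.chunk",
        "choices": [{
            "index": 0,
            "message": {"role": "assistant", "content": "Turning on the lights."},
        }],
    }

    # Legacy text-completion response: the text sits under the "text" key that
    # the fallback path expects.
    text_response = {
        "object": "text_completion",
        "choices": [{"index": 0, "text": "Turning on the lights."}],
    }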

This PR introduces a check for `chat.completion.chunk` as well.
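
A minimal sketch of the kind of branching involved is shown below. The structure is inferred from the traceback rather than copied from the integration, so the method shape and field handling are assumptions; the point is only that `chat.completion.chunk` should be routed through the chat branch instead of the `text` fallback:

    def _extract_response(self, response_json: dict) -> str:
        """Pull the generated text out of an OpenAI-compatible response (sketch)."""
        choices = response_json["choices"]

        # Treat both chat object types as chat-style responses; recent LocalAI
        # versions may label responses as "chat.completion.chunk".
        if response_json["object"] in ("chat.completion", "chat.completion.chunk"):
            # Chunked payloads may carry the text under "delta" instead of
            # "message", so fall back between the two (assumption).
            message = choices[0].get("message") or choices[0].get("delta", {})
            return message.get("content", "")

        # Legacy text-completion responses keep the output under "text".
        return choices[0]["text"]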

acon96 commented 2 months ago

Thanks for the contribution!