Recent versions of LocalAI return the object type `chat.completion.chunk` as well as `chat.completion`. The component only checks for `chat.completion`. This results in the following exception:
```
  File "/config/custom_components/llama_conversation/agent.py", line 262, in async_process
    response = await self._async_generate(conversation)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/config/custom_components/llama_conversation/agent.py", line 187, in _async_generate
    return await self.hass.async_add_executor_job(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/concurrent/futures/thread.py", line 58, in run
    result = self.fn(*self.args, **self.kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/config/custom_components/llama_conversation/agent.py", line 907, in _generate
    return self._extract_response(result.json())
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/config/custom_components/llama_conversation/agent.py", line 862, in _extract_response
    return choices[0]["text"]
           ~~~~~~~~~~^^^^^^^^
KeyError: 'text'
```
This PR introduces a check for `chat.completion.chunk` as well.