acon96 / home-llm

A Home Assistant integration & Model to control your smart home using a Local LLM

Unexpected error during intent recognition when using In Context Learning Examples #103

Closed: v1-valux closed this 2 months ago

v1-valux commented 3 months ago

Assist throws an error when ICL examples are enabled (even with Home Model v3):

Logger: homeassistant.components.assist_pipeline.pipeline
Source: components/assist_pipeline/pipeline.py:993
Integration: Assist pipeline ([documentation](https://www.home-assistant.io/integrations/assist_pipeline), [issues](https://github.com/home-assistant/core/issues?q=is%3Aissue+is%3Aopen+label%3A%22integration%3A+assist_pipeline%22))
First occurred: 22:39:13 (5 occurrences)
Last logged: 22:45:03

Unexpected error during intent recognition

Traceback (most recent call last):
  File "/usr/src/homeassistant/homeassistant/components/assist_pipeline/pipeline.py", line 993, in recognize_intent
    conversation_result = await conversation.async_converse(
                          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/src/homeassistant/homeassistant/components/conversation/__init__.py", line 544, in async_converse
    result = await agent.async_process(
             ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/config/custom_components/llama_conversation/__init__.py", line 284, in async_process
    message = self._generate_system_prompt(raw_prompt)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/config/custom_components/llama_conversation/__init__.py", line 542, in _generate_system_prompt
    render_variables["response_examples"] = "\n".join(icl_example_generator(4, list(entities_to_expose.keys()), all_service_names))
                                            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/config/custom_components/llama_conversation/__init__.py", line 475, in icl_example_generator
    chosen_example = selected_in_context_examples.pop()
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
IndexError: pop from empty list

I don't know what's wrong, because I recall it working a few days ago (including through HA updates). Might something have changed in a Home Assistant update?

However, I deleted all entries and reinstalled the integration, but it still fails whenever "Use ICL Examples" is checked. I'm trying to narrow down the problem, but without success so far.

acon96 commented 3 months ago

I think it is filtering out all of the examples, because it tries to select examples that match the devices you have exposed to the model. I added a check to make sure that doesn't happen.
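
For reference, a minimal sketch of the kind of guard described here, with hypothetical names (the actual logic lives in icl_example_generator in custom_components/llama_conversation/__init__.py):

import json
import random

def icl_example_generator(num_examples, entity_names, service_names, examples):
    # Keep only examples whose service is exposed to the model
    # ("service" is an assumed field name for this sketch).
    selected = [ex for ex in examples if ex["service"] in service_names]

    # Guard: if filtering removed everything, fall back to the full pool
    # instead of popping from an empty list (the IndexError above).
    if not selected:
        selected = list(examples)

    random.shuffle(selected)
    # Cap at the pool size so pop() never hits an empty list.
    for _ in range(min(num_examples, len(selected))):
        yield json.dumps(selected.pop())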

Can you try installing the develop version of the integration from HACS?

v1-valux commented 2 months ago

Thanks for the quick response!

My first guesses were:

But after reinstalling yesterday the problem persisted, so I could rule those out.

Could special characters be a problem for the prompt encoding on the addon side (e.g. friendly_names of exposed devices)? The problem could also have started when I exposed some new devices to test with. (I haven't verified yet which characters are involved, though.)

I'm still trying to find out how to change the branch within HACS without installing the package manually. Any hints appreciated :)

EDIT: Never mind. I had HACS's experimental features checkbox enabled (to make use of automatic HACS updates), which hides the drop-down to choose the branch. I switched experimental features off and now it is there.

Now testing..

BR

v1-valux commented 2 months ago

Weird... no error at first (though it seemed to have problems switching lights on or off, despite the model saying it did).

Then, after pasting my in_context_examples_ger.csv file into the integration directory and editing the integration settings accordingly, the error reappeared.

Switching back to the original in_context_examples.csv in the settings didn't help either. I then noticed that the develop version of the file has fewer examples (garage_door and blinds merged into cover?), so I changed my translated file accordingly, but now it gets even weirder for me:

The original problem now persists even if I reinstall the integration completely and restart HA (with the original, unchanged integration folder).

I don't think it's the CSV file in particular, but unless I uncheck "Use ICL Examples" I can't work around the problem.

I run the latest HAOS and Core versions (I was using home-llm v0.2.10 when the problem appeared). Still not sure what's happening here; is there any way to debug this?

acon96 commented 2 months ago

I run the latest HAOS and Core versions (I was using home-llm v0.2.10 when the problem appeared). Still not sure what's happening here; is there any way to debug this?

You can enable debug logging for the component by adding this to your configuration.yaml:

logger:
  default: info
  logs:
    custom_components.llama_conversation: debug

I then noticed that the develop version of the file has fewer examples (garage_door and blinds merged into cover?), so I changed my translated file accordingly, but now it gets even weirder for me.

For that, the newer file is the correct one to translate. The garage_door and blinds device types don't actually exist in Home Assistant; they were a mistake from a few months ago.

v1-valux commented 2 months ago

Thanks, I've configured the logger accordingly, but unfortunately I don't get any log messages from llama_conversation while encountering the problem. :(

..only the original one from assist_pipeline:

Traceback (most recent call last):
  File "/usr/src/homeassistant/homeassistant/components/assist_pipeline/pipeline.py", line 993, in recognize_intent
    conversation_result = await conversation.async_converse(
                          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/src/homeassistant/homeassistant/components/conversation/__init__.py", line 544, in async_converse
    result = await agent.async_process(
             ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/config/custom_components/llama_conversation/__init__.py", line 284, in async_process
    message = self._generate_system_prompt(raw_prompt)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/config/custom_components/llama_conversation/__init__.py", line 542, in _generate_system_prompt
    render_variables["response_examples"] = "\n".join(icl_example_generator(4, list(entities_to_expose.keys()), all_service_names))
                                            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/config/custom_components/llama_conversation/__init__.py", line 475, in icl_example_generator
    chosen_example = selected_in_context_examples.pop()
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
IndexError: pop from empty list

acon96 commented 2 months ago

There's nothing even if you view or download the full logs?

v1-valux commented 2 months ago

Thank you, I was not aware that the "normal" log view doesn't show debug logs.

In fact, when I view the full log, there is something right above the error, but it only seems to show some attribute data read from my devices:

2024-04-14 15:49:28.821 DEBUG (MainThread) [custom_components.llama_conversation] rgb_color = (255, 233, 215)
2024-04-14 15:49:28.821 DEBUG (MainThread) [custom_components.llama_conversation] brightness = 161
2024-04-14 15:49:28.821 DEBUG (MainThread) [custom_components.llama_conversation] rgb_color = (255, 233, 215)
2024-04-14 15:49:28.821 DEBUG (MainThread) [custom_components.llama_conversation] brightness = 161
2024-04-14 15:49:28.822 DEBUG (MainThread) [custom_components.llama_conversation] rgb_color = (255, 233, 215)
2024-04-14 15:49:28.822 DEBUG (MainThread) [custom_components.llama_conversation] brightness = 161
2024-04-14 15:49:28.822 DEBUG (MainThread) [custom_components.llama_conversation] rgb_color = (255, 233, 215)
2024-04-14 15:49:28.822 DEBUG (MainThread) [custom_components.llama_conversation] brightness = 184
2024-04-14 15:49:28.822 DEBUG (MainThread) [custom_components.llama_conversation] rgb_color = (255, 233, 215)
2024-04-14 15:49:28.822 DEBUG (MainThread) [custom_components.llama_conversation] brightness = 184
2024-04-14 15:49:28.822 DEBUG (MainThread) [custom_components.llama_conversation] rgb_color = (255, 233, 215)
2024-04-14 15:49:28.823 DEBUG (MainThread) [custom_components.llama_conversation] brightness = 184
2024-04-14 15:49:28.823 DEBUG (MainThread) [custom_components.llama_conversation] rgb_color = (255, 233, 215)
2024-04-14 15:49:28.823 DEBUG (MainThread) [custom_components.llama_conversation] brightness = 184
2024-04-14 15:49:28.823 DEBUG (MainThread) [custom_components.llama_conversation] rgb_color = (255, 233, 215)
2024-04-14 15:49:28.823 DEBUG (MainThread) [custom_components.llama_conversation] brightness = 138
2024-04-14 15:49:28.823 DEBUG (MainThread) [custom_components.llama_conversation] rgb_color = (255, 233, 215)
2024-04-14 15:49:28.823 DEBUG (MainThread) [custom_components.llama_conversation] brightness = 138
2024-04-14 15:49:28.823 DEBUG (MainThread) [custom_components.llama_conversation] rgb_color = (255, 233, 215)
2024-04-14 15:49:28.824 DEBUG (MainThread) [custom_components.llama_conversation] brightness = 138
2024-04-14 15:49:28.824 DEBUG (MainThread) [custom_components.llama_conversation] rgb_color = (255, 233, 215)
2024-04-14 15:49:28.824 DEBUG (MainThread) [custom_components.llama_conversation] brightness = 138
2024-04-14 15:49:28.824 DEBUG (MainThread) [custom_components.llama_conversation] rgb_color = (255, 233, 215)
2024-04-14 15:49:28.824 DEBUG (MainThread) [custom_components.llama_conversation] brightness = 138
2024-04-14 15:49:28.824 DEBUG (MainThread) [custom_components.llama_conversation] rgb_color = (255, 233, 215)
2024-04-14 15:49:28.824 DEBUG (MainThread) [custom_components.llama_conversation] brightness = 184
2024-04-14 15:49:28.824 DEBUG (MainThread) [custom_components.llama_conversation] temperature = 23.0
2024-04-14 15:49:28.824 DEBUG (MainThread) [custom_components.llama_conversation] temperature = 24.0
2024-04-14 15:49:28.824 DEBUG (MainThread) [custom_components.llama_conversation] temperature = 23.0
2024-04-14 15:49:28.824 DEBUG (MainThread) [custom_components.llama_conversation] temperature = 24.0
2024-04-14 15:49:28.825 DEBUG (MainThread) [custom_components.llama_conversation] temperature = 21.0
2024-04-14 15:49:28.826 ERROR (MainThread) [homeassistant.components.assist_pipeline.pipeline] Unexpected error during intent recognition

Best Regards

acon96 commented 2 months ago

Hmm, the attributes being None shouldn't have caused them to be filtered out. Can you make sure you're on the latest version and try again? I added some more debug logging in the latest version, plus another workaround that might fix things.

v1-valux commented 2 months ago

Okay, I updated to v0.2.12 and have not experienced the error again; promising so far. :)

Although, even if I set the number of ICL examples to anything above 3, it now only generates 3 examples, as seen in the debug logs:

Attempted to generate 5 ICL examples for conversation, but only 3 are available!
Respond to the following user instruction by responding in the same format as the following examples:
{"to_say": "Licht umgeschaltet.", "service": "light.toggle", "target_device": "light.yeelight_color_05"}
{"to_say": "Licht eingeschaltet.", "service": "light.turn_on", "target_device": "light.yeelight_color_05"}
{"to_say": "Licht ausgeschaltet.", "service": "light.turn_off", "target_device": "light.yeelight_color_05"}
User Instruction:Test [/INST] 

(standard prompt)

And the integration also throws a few more log messages related to the stop token every time I prompt it:

Model response did not end on a stop token (unfinished sentence)

But the ICL feature itself seems to be working again for me at least. :) Thanks!

acon96 commented 2 months ago

Glad it's working again.

And that warning is expected. If you only have lights exposed to the model, then it will only pick examples related to lights, and there are only 3 light examples by default (one for each service: turn_on, turn_off, and toggle), so if you try to generate 5 examples it runs out.
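
As a hypothetical illustration of that behavior (assumed names, not the actual implementation): with only the three default light examples in the pool, requesting five caps at three and logs the warning quoted above.

# Hypothetical illustration of the capping behavior.
light_pool = ["light.turn_on", "light.turn_off", "light.toggle"]  # the 3 defaults
requested = 5
available = len(light_pool)
if requested > available:
    print(f"Attempted to generate {requested} ICL examples for conversation, "
          f"but only {available} are available!")
num_generated = min(requested, available)  # -> 3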

v1-valux commented 2 months ago

You're right. At first I thought I had certainly added a bunch of switches as well, but I realized there are indeed only input_booleans, lights, and climates exposed, of which only the lights currently work with home-llm. :)

Anyway, cheers. Keep up the great work! :+1: