acon96 / home-llm

A Home Assistant integration & Model to control your smart home using a Local LLM

HA YAML config (how to customize the prompt?) #92

Closed PheelTi closed 3 months ago

PheelTi commented 3 months ago

Hi, very nice project! Would it be possible to configure agents inside configuration.yaml? For instance:

llama_conversation:
  - name: agent1
    model_backend: ollama
    host: ollama-host
    port: 11434
    huggingface_model: "mistral-7b:latest"
    prompt: > 
      This is my own prompt
      available services:  {{ services }}
      available devices:
      {{ devices }}
  - name: agent2
    model_backend: ollama
    host: ollama-host
    port: 11434
    huggingface_model: "mixtral-8x7b:latest"
    prompt: > 
      This is another prompt
      available services:  {{ services }}
      available devices:
      {{ devices }}

along with support for any of the additional config options defined with CONF_* (temperature, etc.).

I tried several things, but HA keeps telling me: "The llama_conversation integration does not support YAML setup, please remove it from your configuration file"

Subsidiary question: how do I customize the prompt in HA in its current state (without forking or modifying the source code)?

acon96 commented 3 months ago

I only implemented ConfigFlow for the integration, since that was how the example integration was set up. It needs to also support the CONFIG_SCHEMA/PLATFORM_SCHEMA method: https://developers.home-assistant.io/docs/configuration_yaml_index

I'll add this to the TODO list.
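
For reference, a minimal sketch of what that CONFIG_SCHEMA method could look like for this domain. The voluptuous/config-validation helpers are the standard ones from the linked docs; the option names (name, model_backend, host, port, huggingface_model, prompt) just mirror the YAML example above and are not the integration's actual keys:

import voluptuous as vol
import homeassistant.helpers.config_validation as cv

DOMAIN = "llama_conversation"

# Hypothetical per-agent schema mirroring the YAML example above.
AGENT_SCHEMA = vol.Schema(
    {
        vol.Required("name"): cv.string,
        vol.Required("model_backend"): cv.string,
        vol.Required("host"): cv.string,
        vol.Optional("port", default=11434): cv.port,
        vol.Required("huggingface_model"): cv.string,
        vol.Optional("prompt"): cv.template,
    }
)

# Accept a list of agents under the integration's domain key.
CONFIG_SCHEMA = vol.Schema(
    {DOMAIN: vol.All(cv.ensure_list, [AGENT_SCHEMA])},
    extra=vol.ALLOW_EXTRA,
)

async def async_setup(hass, config):
    """Sketch only: receive the validated YAML-defined agents."""
    for agent_conf in config.get(DOMAIN, []):
        hass.data.setdefault(DOMAIN, []).append(agent_conf)
    return True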

PheelTi commented 3 months ago

Okay. Python is not my main programming language, and I don't know how HA plugins work internally, but since it's well documented, I may try a PR when I have a little time.

acon96 commented 3 months ago

I did some more digging, and unfortunately it is not possible to configure this via YAML alone. In order to set up a conversation agent, you need a Config Entry, which can't be created from configuration.yaml.
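
For anyone curious why: registering a conversation agent goes through HA's conversation.async_set_agent, which is keyed on a ConfigEntry. A minimal sketch of that standard pattern (the agent class here is hypothetical, not this project's code):

from homeassistant.components import conversation
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant

DOMAIN = "llama_conversation"

async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
    """Register the agent; note the API requires a ConfigEntry."""
    agent = LocalLlamaAgent(hass, entry)  # hypothetical agent class
    conversation.async_set_agent(hass, entry, agent)
    return True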