acon96 / home-llm

A Home Assistant integration & Model to control your smart home using a Local LLM

Ability to delete or not use old messages (context limit) #48

Closed Anto79-ops closed 7 months ago

Anto79-ops commented 8 months ago

hey,

We think the issue of getting blank messages stems from the fact that old conversations are not purged, so the context limit gets used up rather quickly. This could explain why my requests work the first 1 or 2 times, but things become flaky after asking it a few more questions.

Perhaps add an option to delete old messages on every new request, or to only keep the last 1, for example?

thanks

Check out the fork from @lunamidori5 here: https://github.com/Anto79-ops/home-llm

acon96 commented 8 months ago

Feel free to open a pull request and I'll take a look.

I was looking into supporting things like RoPE scaling and "self extend" to extend the size of the context. Both techniques are supported by llama.cpp, along with simple sliding-window truncation; they just need to be wired up.
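For the truncation side of this, a sliding window over the chat history could look something like the sketch below. This is not code from the home-llm repo; the `count_tokens` callback and message list shape are assumptions for illustration. It keeps the system prompt and drops the oldest turns until the remainder fits in the token budget.

```python
# Hypothetical sketch of sliding-window history truncation.
# `count_tokens` is an assumed callback (e.g. wrapping the model's
# tokenizer); `messages` is assumed to be [system, turn1, turn2, ...].

def truncate_history(messages, max_tokens, count_tokens):
    """Keep the system prompt (first message) and drop the oldest
    turns until the history fits within max_tokens."""
    if not messages:
        return []
    system, rest = messages[0], messages[1:]
    budget = max_tokens - count_tokens(system)
    kept, total = [], 0
    # Walk newest-to-oldest so the most recent turns survive.
    for msg in reversed(rest):
        cost = count_tokens(msg)
        if total + cost > budget:
            break
        kept.append(msg)
        total += cost
    return [system] + list(reversed(kept))
```

Keeping only the last N turns (the "only keep 1" option suggested above) is just the special case of stopping the loop after N messages instead of at a token budget.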

Anto79-ops commented 8 months ago

If you're willing to debug a little: the commits on my fork are not working 100%, and there were no logs, so the issue below could be caused directly by the config entry.

(screenshot attached: image.png)

lunamidori5 commented 8 months ago

@Anto79-ops that's my jank code; I know what's wrong and will try to fix it.