Mintplex-Labs / anything-llm

The all-in-one Desktop & Docker AI application with built-in RAG, AI agents, and more.
https://anythingllm.com
MIT License

[FEAT]: Improve UX to let user know which LLM is actually going to be used #2642

Open kcalliauw opened 4 days ago

kcalliauw commented 4 days ago

What would you like to see?

As a user I like to switch between LLMs often (including switching between Ollama models and external OpenAI and Anthropic models), and I currently find it confusing, even impossible, to predict which of my LLMs will be used for a given thread.

As far as I can tell, if I change the chat agent in my workspace, existing threads remain unaffected, which "feels" like a bug to me. I would expect, where possible, context to be carried over to the newly chosen LLM.

Ideally, as in apps such as ChatGPT or TypingMind, the LLM could be chosen inside the chat context/thread. For example, here are a few UX patterns I find very beneficial:

(Screenshots: CleanShot 2024-11-18 at 09 35 41, CleanShot 2024-11-18 at 09 36 44)

Just my 2 cents. Keep up the great work!