c0sogi / LLMChat

A full-stack WebUI implementation for large language models, such as ChatGPT or LLaMA.
MIT License

/query as default #19

Closed Torhamilton closed 1 year ago

Torhamilton commented 1 year ago

Implement /bypass to query the LLM directly without hitting the vector store. Otherwise, every chat must check the vector store for embeddings before interacting with the LLM. The purpose of this app, I believe, is to grant longer context memory, but being forced to prefix every chat with /query is tiresome. Embeddings can prompt the LLM on how to behave at chat initiation.
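The routing being requested could look something like the sketch below. All names here (`route_message`, `vector_db.search`, `llm.ask`) are illustrative assumptions, not the repo's actual API; the point is only that the explicit `/query` prefix triggers the embedding lookup while the default path goes straight to the LLM.

```python
# Hypothetical command router: "/query" performs a vector lookup,
# everything else bypasses it. Names are illustrative, not LLMChat's API.
def route_message(text: str, vector_db, llm):
    if text.startswith("/query "):
        question = text[len("/query "):]
        context = vector_db.search(question)   # embedding lookup
        return llm.ask(question, extra_context=context)
    # Default path: send the message straight to the LLM, no vector hit.
    return llm.ask(text)
```

The issue proposes inverting this: make the direct path the default and reserve an explicit command (here /query) for the vector lookup.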

c0sogi commented 1 year ago

Setting vector queries as the default has the adverse effect of reducing context memory because it can fetch out-of-context data. I've set it up so that when you call a query command, that data is only used in the next answer and disappears from context memory.
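The described behavior, where query results feed only the next answer and then vanish, can be sketched as a one-shot context buffer. This is a minimal illustration under assumed names (`ChatSession`, `pending_context`), not the project's actual implementation.

```python
# Sketch of one-shot query context: vector results are injected into the
# next prompt only, then discarded so they cannot pollute later turns.
class ChatSession:
    def __init__(self):
        self.history = []            # long-lived conversation memory
        self.pending_context = None  # vector results for the NEXT turn only

    def add_query_results(self, docs):
        self.pending_context = docs

    def build_prompt(self, user_msg: str) -> str:
        parts = list(self.history)
        if self.pending_context is not None:
            parts.append("Context: " + " ".join(self.pending_context))
            self.pending_context = None  # consumed after a single answer
        parts.append("User: " + user_msg)
        self.history.append("User: " + user_msg)
        return "\n".join(parts)
```

Keeping the retrieved data out of the persistent history is what prevents out-of-context fetches from eating into the conversation's context window.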

Torhamilton commented 1 year ago

Yet another reason for a robust vector DB. I will test out the new update and see how it fares. Maybe how the system behaves should be an admin-level decision? We should implement a switch.

c0sogi commented 1 year ago

Added an admin value to the status column of the MySQL users table. It is now possible to identify who is an admin in the chat, and this can be changed in the admin panel. What's next?
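An admin check against such a status column could be as simple as the sketch below. The `users` table and `status` column follow the comment above; the `id` column, function name, and use of sqlite3 (purely so the demo is runnable) are assumptions.

```python
# Illustrative admin check against a users table with a 'status' column.
# sqlite3 stands in for MySQL here only to keep the example self-contained.
import sqlite3

def is_admin(conn, user_id: int) -> bool:
    row = conn.execute(
        "SELECT status FROM users WHERE id = ?", (user_id,)
    ).fetchone()
    return row is not None and row[0] == "admin"
```

With a check like this in place, the proposed behavior switch (default /query vs. direct-to-LLM) could be gated so only admins can flip it in the admin panel.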