reorproject / reor

Private & local AI personal knowledge management app.
https://reorproject.org
GNU Affero General Public License v3.0
6.98k stars · 429 forks

Remote LLM only supporting one chat. #436

Open Jflick58 opened 4 days ago

Jflick58 commented 4 days ago

Describe the bug: I have a remote LLM that serves as our internal proxy for Azure OpenAI and Google Gemini. I have configured it correctly, as it does occasionally work. However, I often get an error pop-up:

Error: LLM not found.
Stack Trace:
Error: LLM not found.
    at QA (file:///Applications/Reor.app/Contents/Resources/app.asar/dist/assets/index-92e5223c.js:148:644)
    at async O (file:///Applications/Reor.app/Contents/Resources/app.asar/dist/assets/index-92e5223c.js:259:6679)

To Reproduce:

  1. Set up the remote LLM (proxied Azure OpenAI GPT-4o)
  2. Set the default LLM to GPT-4o
  3. Start a chat. (it will work)
  4. Start a new chat with that same LLM (or switch between chats)
  5. See error.

Expected behavior: Chats should function independently, so that I can switch LLMs or hold multiple separate chats with the same LLM.

Barring that, exposing logs or a more informative error message would be helpful.
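As a sketch of what a more informative error could look like, here is a hypothetical TypeScript lookup helper that reports both the requested model and the models actually configured, instead of a bare "LLM not found". The `LLMConfig` shape and `findLLM` name are illustrative assumptions, not Reor's actual API.

```typescript
// Hypothetical sketch — LLMConfig and findLLM are illustrative names,
// not Reor's real internals.
interface LLMConfig {
  modelName: string;
}

function findLLM(configs: LLMConfig[], modelName: string): LLMConfig {
  const match = configs.find((c) => c.modelName === modelName);
  if (!match) {
    // Surface the requested name and the available configs in the
    // error message, so a user can see why the lookup failed.
    const available =
      configs.map((c) => c.modelName).join(", ") || "(none configured)";
    throw new Error(
      `LLM "${modelName}" not found. Available LLMs: ${available}`
    );
  }
  return match;
}
```

An error in this style would make it immediately obvious whether the chat's stored model name has drifted out of sync with the configured LLM list.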

Additional context: I love the tool and hope to contribute when I get some free time!

Jflick58 commented 3 days ago

I did some additional investigation: running multiple chats is not a problem in v0.2.19.