Describe the bug
I have a remote LLM that is our internal proxy for Azure OpenAI and Google Gemini. It is configured correctly, as it does work occasionally. However, I often get an error pop-up:
Error: LLM not found.
Stack Trace:
Error: LLM not found.
    at QA (file:///Applications/Reor.app/Contents/Resources/app.asar/dist/assets/index-92e5223c.js:148:644)
    at async O (file:///Applications/Reor.app/Contents/Resources/app.asar/dist/assets/index-92e5223c.js:259:6679)
To Reproduce
Steps to reproduce the behavior:
1. Set up a remote LLM (proxied Azure OpenAI GPT-4o).
2. Set the default LLM to GPT-4o.
3. Start a chat (it will work).
4. Start a new chat with that same LLM (or switch between chats).
5. See the error.
Expected behavior
Chats should function independently, so that I can switch LLMs or run multiple separate chats with the same LLM.
Barring that, exposing logs or showing a more informative error message would be helpful.
Screenshots
Desktop (please complete the following information):
OS: macOS
Hardware: MacBook Pro 14-inch, 2021 (M1 Pro, 10-core), 16 GB RAM
Version: Sonoma 14.6.1
Additional context
Love the tool and hope to contribute when I get some free time!