Mintplex-Labs / anything-llm

The all-in-one Desktop & Docker AI application with built-in RAG, AI agents, and more.
https://anythingllm.com
MIT License

[FEAT]: Allow Azure OpenAI defined as Workspace LLM Provider #1166

Open tsaibing opened 7 months ago

tsaibing commented 7 months ago

How are you running AnythingLLM?

All versions

What happened?

Azure OpenAI can only be set up as the system default LLM Preference.

But I want to specify it for a single workspace, and it is not available in the drop-down list.

Issue found in both Windows Desktop v1.5.0 and Docker in Linux.

Are there known steps to reproduce?

  1. Create a Workspace
  2. Go to Chat Settings - Workspace LLM Provider
  3. There is no option for 'Azure OpenAI'

ShadowArcanist commented 7 months ago

AnythingLLM macOS version v1.5.0 also doesn't have an 'Azure OpenAI' option in the LLM provider dropdown in Workspace settings.

timothycarambat commented 7 months ago

This is actually intentional for Workspace LLMs at this time. https://github.com/Mintplex-Labs/anything-llm/blob/bf165f2ab25b873581edcacac934ebff3f429d8d/frontend/src/pages/WorkspaceSettings/ChatSettings/WorkspaceLLMSelection/index.jsx#L8

spiveym commented 2 months ago

This is actually intentional for Workspace LLMs at this time.

https://github.com/Mintplex-Labs/anything-llm/blob/bf165f2ab25b873581edcacac934ebff3f429d8d/frontend/src/pages/WorkspaceSettings/ChatSettings/WorkspaceLLMSelection/index.jsx#L8

but what's the reasoning?

timothycarambat commented 2 months ago

@spiveym It was because we use `process.env` to keep track of the model preference, and the model selector is, well, a dropdown, so we can't use `/models` to enumerate the available models.

We need a combobox component that supports free-form input alongside the available models, so we can handle providers with many models at a time, specifically those that don't support `/models`.
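As a rough illustration of the combobox idea above, here is a minimal sketch of the selection logic such a component might sit on top of. The function name `resolveModelSelection` and its shape are hypothetical, not part of the AnythingLLM codebase: it filters the provider's enumerated models (when any exist) for suggestions, while always accepting the typed value, which is what makes providers without a `/models`-style listing, like the Azure OpenAI case in this issue, workable.

```javascript
// Hypothetical helper for a free-form combobox (not actual AnythingLLM code).
// `availableModels` is whatever the provider can enumerate (possibly empty,
// e.g. for providers with no /models endpoint); `typedValue` is the user's input.
function resolveModelSelection(availableModels, typedValue) {
  const query = typedValue.trim().toLowerCase();

  // Suggestions come from the enumerated models, narrowed by the typed text.
  const suggestions = availableModels.filter((model) =>
    model.toLowerCase().includes(query)
  );

  // Free-form input is always a valid selection, so a deployment name the
  // provider cannot list (the Azure OpenAI case) can still be saved.
  return { suggestions, selected: typedValue.trim() };
}
```

With enumerable providers, typing narrows the dropdown; with providers that list nothing, `suggestions` is empty and the typed deployment name is saved as-is.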