lm-sys / RouteLLM

A framework for serving and evaluating LLM routers - save LLM costs without compromising quality!
Apache License 2.0

OpenAI dependency #19

Open DanielChico opened 1 month ago

DanielChico commented 1 month ago

Hello, I am facing this issue:

```
File "/app/app/src/router/__init__.py", line 5, in <module>
    from .gateway.router import router as gateway_router
File "/app/app/src/router/gateway/router.py", line 4, in <module>
    from routellm.controller import Controller
File "/usr/local/lib/python3.12/site-packages/routellm/controller.py", line 10, in <module>
    from routellm.routers.routers import ROUTER_CLS
File "/usr/local/lib/python3.12/site-packages/routellm/routers/routers.py", line 17, in <module>
    from routellm.routers.matrix_factorization.model import MODEL_IDS, MFModel
File "/usr/local/lib/python3.12/site-packages/routellm/routers/matrix_factorization/model.py", line 4, in <module>
    from routellm.routers.similarity_weighted.utils import OPENAI_CLIENT
File "/usr/local/lib/python3.12/site-packages/routellm/routers/similarity_weighted/utils.py", line 11, in <module>
    OPENAI_CLIENT = OpenAI()
                    ^^^^^^^^
File "/usr/local/lib/python3.12/site-packages/openai/_client.py", line 104, in __init__
    raise OpenAIError(
openai.OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable
```

Of course, I can set the environment variable, but I don't want to depend on it. It would be preferable to have a way to set the base_url, model, and api_key for this client as well.

iojw commented 1 month ago

Thank you for raising this! We are aware of it and are actively looking into a fix. Two points:

  1. Currently, OpenAI's client is required for generating embeddings for the mf and sw_ranking routers, but not for the classifiers. So if you use the bert router, for example, and no OpenAI models, you can set the OpenAI key to a dummy value as a temporary workaround. I'll work on fixing this so that you don't need to do that when you don't require embeddings.

  2. Point 1 applies only to generating embeddings for mf and sw_ranking. For the actual LLM calls, you can already set the base URL and API key in the controller or server config.
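For point 1, a minimal sketch of the dummy-key workaround (the key value is arbitrary; the OpenAI client's constructor only checks that a key is present, and it is never used by a classifier router like bert):

```python
import os

# Temporary workaround: give the OpenAI client a placeholder key so that
# `OPENAI_CLIENT = OpenAI()` in routellm does not raise at import time.
# Only appropriate when your chosen router (e.g. bert) never calls OpenAI.
os.environ.setdefault("OPENAI_API_KEY", "sk-dummy")

# Import routellm only after the variable is set, e.g.:
# from routellm.controller import Controller
```

Setting the variable in Python before the first routellm import (rather than in the shell) keeps the workaround self-contained, which matters for the containerized setup in the traceback above.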

I understand this can be confusing, will try to make this clearer. Let me know if you have any questions!
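To illustrate the direction of the fix mentioned in point 1, one common pattern is to defer client construction until it is actually needed, so importing the package never requires a key. This is only a hypothetical sketch (the `LazyClient` name and design are mine, not part of routellm):

```python
class LazyClient:
    """Defer construction of a client until its first use.

    A module-level `OPENAI_CLIENT = LazyClient(lambda: OpenAI())` would
    postpone the api_key check from import time to the first embedding call.
    """

    def __init__(self, factory):
        self._factory = factory  # callable that builds the real client
        self._client = None      # not constructed yet

    def __getattr__(self, name):
        # Build the real client on first attribute access, then delegate.
        if self._client is None:
            self._client = self._factory()
        return getattr(self._client, name)
```

With this pattern, users of classifier routers would never trigger the `OpenAIError`, while embedding-based routers would fail (with the same clear message) only at the point where embeddings are actually requested.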