lm-sys / RouteLLM

A framework for serving and evaluating LLM routers - save LLM costs without compromising quality!
Apache License 2.0

OpenAIError: The api_key client option must be set #38

Open · DmitriyG228 opened 1 month ago

DmitriyG228 commented 1 month ago

While running the basic example, I get this error:

```python
import os
from routellm.controller import Controller

os.environ["OPENAI_API_KEY"] = 'my api'

client = Controller(
    routers=["mf"],
    strong_model="gpt-4o",
    weak_model="gpt-4o-mini",
)
```

```
---------------------------------------------------------------------------
OpenAIError                               Traceback (most recent call last)
Cell In[1], line 2
      1 import os
----> 2 from routellm.controller import Controller
      4 os.environ["OPENAI_API_KEY"] = 'sk-***'
      7 client = Controller(
      8     routers=["mf"],
      9     strong_model="gpt-4o",
     10     weak_model="gpt-4o-mini",
     11 )

File ~/playground/agent_2906/0507/RouteLLM/routellm/controller.py:10
      7 from litellm import acompletion, completion
      8 from tqdm import tqdm
---> 10 from routellm.routers.routers import ROUTER_CLS
     12 # Default config for routers augmented using golden label data from GPT-4.
     13 # This is exactly the same as config.example.yaml.
     14 GPT_4_AUGMENTED_CONFIG = {
     15     "sw_ranking": {
     16         "arena_battle_datasets": [
   (...)
     27     "mf": {"checkpoint_path": "routellm/mf_gpt4_augmented"},
     28 }

File ~/playground/agent_2906/0507/RouteLLM/routellm/routers/routers.py:17
     12 from routellm.routers.causal_llm.llm_utils import (
     13     load_prompt_format,
     14     to_openai_api_messages,
     15 )
     16 from routellm.routers.causal_llm.model import CausalLLMClassifier
---> 17 from routellm.routers.matrix_factorization.model import MODEL_IDS, MFModel
     18 from routellm.routers.similarity_weighted.utils import (
     19     OPENAI_CLIENT,
     20     compute_elo_mle_with_tie,
     21     compute_tiers,
     22     preprocess_battles,
     23 )
     26 def no_parallel(cls):

File ~/playground/agent_2906/0507/RouteLLM/routellm/routers/matrix_factorization/model.py:4
      1 import torch
      2 from huggingface_hub import PyTorchModelHubMixin
----> 4 from routellm.routers.similarity_weighted.utils import OPENAI_CLIENT
      6 MODEL_IDS = {
      7     "RWKV-4-Raven-14B": 0,
      8     "alpaca-13b": 1,
   (...)
     70     "zephyr-7b-beta": 63,
     71 }
     74 class MFModel(torch.nn.Module, PyTorchModelHubMixin):

File ~/playground/agent_2906/0507/RouteLLM/routellm/routers/similarity_weighted/utils.py:11
      8 from sklearn.linear_model import LogisticRegression
     10 choices = ["A", "B", "C", "D"]
---> 11 OPENAI_CLIENT = OpenAI()
     14 def compute_tiers(model_ratings, num_tiers):
     15     n = len(model_ratings)

File ~/anaconda3/envs/langchain/lib/python3.11/site-packages/openai/_client.py:105, in OpenAI.__init__(self, api_key, organization, project, base_url, timeout, max_retries, default_headers, default_query, http_client, _strict_response_validation)
    103     api_key = os.environ.get("OPENAI_API_KEY")
    104 if api_key is None:
--> 105     raise OpenAIError(
    106         "The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable"
    107     )
    108 self.api_key = api_key
    110 if organization is None:

OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable
```
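Reading the traceback bottom-up: `OPENAI_CLIENT = OpenAI()` in `routellm/routers/similarity_weighted/utils.py` runs as a side effect of the import on cell line 2, so the `os.environ` assignment on line 4 never executes. A minimal sketch of the same failure mode in isolation (no routellm needed; the `pop` just simulates an unset key):

```python
import os

from openai import OpenAI

os.environ.pop("OPENAI_API_KEY", None)  # simulate a missing key

# Mirrors the module-level `OPENAI_CLIENT = OpenAI()` in
# routellm/routers/similarity_weighted/utils.py: the client is built
# at import time, before any caller gets a chance to set the key.
try:
    client = OpenAI()
except Exception as exc:
    print(type(exc).__name__, exc)
```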

DmitriyG228 commented 1 month ago

I tried with a clean conda environment; same result.

fengshichen commented 1 month ago

Are you using an IDE? Try configuring the OPENAI_API_KEY environment variable in your IDE's run configuration.

hmoghimifam commented 1 month ago

Set it as an environment variable by running this in your terminal: `export OPENAI_API_KEY='your_api_key_here'`
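If the export doesn't seem to take effect (e.g. the notebook kernel was launched from a different shell), a quick sanity check before importing routellm (a minimal sketch):

```python
import os

# The routellm import will fail if this prints False.
print("OPENAI_API_KEY set:", bool(os.environ.get("OPENAI_API_KEY")))
```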

iojw commented 1 month ago

Yes, please try setting the OpenAI API key before running.
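Concretely, since the client is created when routellm is imported, the key has to be in the environment before the import statement runs. A sketch of the reordered example (the `sk-...` value is a placeholder):

```python
import os

# Set the key BEFORE importing routellm: the module-level
# `OPENAI_CLIENT = OpenAI()` in similarity_weighted/utils.py
# reads it during import.
os.environ["OPENAI_API_KEY"] = "sk-..."  # placeholder, use your own key

from routellm.controller import Controller

client = Controller(
    routers=["mf"],
    strong_model="gpt-4o",
    weak_model="gpt-4o-mini",
)
```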

lee-b commented 4 weeks ago

@DmitriyG228, you should revoke the key that you pasted above.