lm-sys / RouteLLM

A framework for serving and evaluating LLM routers - save LLM costs without compromising quality!
Apache License 2.0

insufficient_quota error #14

Closed: bisratberhanu closed this issue 3 months ago

bisratberhanu commented 4 months ago

I set up my server successfully, but even when I ask a simple question, the server responds with the following error:

INFO:     127.0.0.1:63236 - "POST /v1/chat/completions HTTP/1.1" 500 Internal Server Error
ERROR:    Exception in ASGI application
Traceback (most recent call last):
  File "C:\Users\Bisrat\AppData\Local\Programs\Python\Python312\Lib\site-packages\uvicorn\protocols\http\httptools_impl.py", line 399, in run_asgi
    result = await app(  # type: ignore[func-returns-value]
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\Bisrat\AppData\Local\Programs\Python\Python312\Lib\site-packages\uvicorn\middleware\proxy_headers.py", line 70, in __call__
    return await self.app(scope, receive, send)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\Bisrat\AppData\Local\Programs\Python\Python312\Lib\site-packages\fastapi\applications.py", line 1054, in __call__
    await super().__call__(scope, receive, send)
  File "C:\Users\Bisrat\AppData\Local\Programs\Python\Python312\Lib\site-packages\starlette\applications.py", line 123, in __call__
    await self.middleware_stack(scope, receive, send)
  File "C:\Users\Bisrat\AppData\Local\Programs\Python\Python312\Lib\site-packages\starlette\middleware\errors.py", line 186, in __call__
    raise exc
  File "C:\Users\Bisrat\AppData\Local\Programs\Python\Python312\Lib\site-packages\starlette\middleware\errors.py", line 164, in __call__
    await self.app(scope, receive, _send)
  File "C:\Users\Bisrat\AppData\Local\Programs\Python\Python312\Lib\site-packages\starlette\middleware\exceptions.py", line 65, in __call__
    await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
  File "C:\Users\Bisrat\AppData\Local\Programs\Python\Python312\Lib\site-packages\starlette\_exception_handler.py", line 64, in wrapped_app
    raise exc
  File "C:\Users\Bisrat\AppData\Local\Programs\Python\Python312\Lib\site-packages\starlette\_exception_handler.py", line 53, in wrapped_app
    await app(scope, receive, sender)
  File "C:\Users\Bisrat\AppData\Local\Programs\Python\Python312\Lib\site-packages\starlette\routing.py", line 756, in __call__
    await self.middleware_stack(scope, receive, send)
  File "C:\Users\Bisrat\AppData\Local\Programs\Python\Python312\Lib\site-packages\starlette\routing.py", line 776, in app
    await route.handle(scope, receive, send)
  File "C:\Users\Bisrat\AppData\Local\Programs\Python\Python312\Lib\site-packages\starlette\routing.py", line 297, in handle
    await self.app(scope, receive, send)
  File "C:\Users\Bisrat\AppData\Local\Programs\Python\Python312\Lib\site-packages\starlette\routing.py", line 77, in app 
    await wrap_app_handling_exceptions(app, request)(scope, receive, send)
  File "C:\Users\Bisrat\AppData\Local\Programs\Python\Python312\Lib\site-packages\starlette\_exception_handler.py", line 64, in wrapped_app
    raise exc
  File "C:\Users\Bisrat\AppData\Local\Programs\Python\Python312\Lib\site-packages\starlette\_exception_handler.py", line 53, in wrapped_app
    await app(scope, receive, sender)
  File "C:\Users\Bisrat\AppData\Local\Programs\Python\Python312\Lib\site-packages\starlette\routing.py", line 72, in app 
    response = await func(request)
               ^^^^^^^^^^^^^^^^^^^
  File "C:\Users\Bisrat\AppData\Local\Programs\Python\Python312\Lib\site-packages\fastapi\routing.py", line 278, in app  
    raw_response = await run_endpoint_function(
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\Bisrat\AppData\Local\Programs\Python\Python312\Lib\site-packages\fastapi\routing.py", line 191, in run_endpoint_function
    return await dependant.call(**values)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\Bisrat\Desktop\Bisrat files\RouteLLM\routellm\openai_server.py", line 153, in create_chat_completion    
    routed_model = route_fn(prompt, threshold, ROUTED_PAIR)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\Bisrat\Desktop\Bisrat files\RouteLLM\routellm\routers\routers.py", line 42, in route
    if self.calculate_strong_win_rate(prompt) >= threshold:
       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\Bisrat\Desktop\Bisrat files\RouteLLM\routellm\routers\routers.py", line 235, in calculate_strong_win_rate
    winrate = self.model.pred_win_rate(
              ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\Bisrat\AppData\Local\Programs\Python\Python312\Lib\site-packages\torch\utils\_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\Bisrat\Desktop\Bisrat files\RouteLLM\routellm\routers\matrix_factorization\model.py", line 124, in pred_win_rate
    logits = self.forward([model_a, model_b], prompt)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\Bisrat\Desktop\Bisrat files\RouteLLM\routellm\routers\matrix_factorization\model.py", line 113, in forward
    OPENAI_CLIENT.embeddings.create(input=[prompt], model=self.embedding_model)
  File "C:\Users\Bisrat\AppData\Local\Programs\Python\Python312\Lib\site-packages\openai\resources\embeddings.py", line 114, in create
    return self._post(
           ^^^^^^^^^^^
  File "C:\Users\Bisrat\AppData\Local\Programs\Python\Python312\Lib\site-packages\openai\_base_client.py", line 1261, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\Bisrat\AppData\Local\Programs\Python\Python312\Lib\site-packages\openai\_base_client.py", line 942, in request
    return self._request(
           ^^^^^^^^^^^^^^
  File "C:\Users\Bisrat\AppData\Local\Programs\Python\Python312\Lib\site-packages\openai\_base_client.py", line 1026, in _request
    return self._retry_request(
           ^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\Bisrat\AppData\Local\Programs\Python\Python312\Lib\site-packages\openai\_base_client.py", line 1074, in _retry_request
    return self._request(
           ^^^^^^^^^^^^^^
  File "C:\Users\Bisrat\AppData\Local\Programs\Python\Python312\Lib\site-packages\openai\_base_client.py", line 1026, in _request
    return self._retry_request(
           ^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\Bisrat\AppData\Local\Programs\Python\Python312\Lib\site-packages\openai\_base_client.py", line 1074, in _retry_request
    return self._request(
           ^^^^^^^^^^^^^^
  File "C:\Users\Bisrat\AppData\Local\Programs\Python\Python312\Lib\site-packages\openai\_base_client.py", line 1041, in _request
    raise self._make_status_error_from_response(err.response) from None
openai.RateLimitError: Error code: 429 - {'error': {'message': 'You exceeded your current quota, please check your plan and billing details. For more information on this error, read the docs: https://platform.openai.com/docs/guides/error-codes/api-errors.', 'type': 'insufficient_quota', 'param': None, 'code': 'insufficient_quota'}}

I don't understand how it could say the quota limit was exceeded before I got even a single response. Please help me navigate this error.

iojw commented 4 months ago

Hi there! This looks like it's caused by insufficient quota on your OpenAI API key. Note from the traceback that the matrix factorization router calls OpenAI's embeddings endpoint to route each prompt, so your key is used before any chat completion is ever made. You need to resolve the quota issue in your OpenAI account first.
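One detail worth knowing: both true rate limiting and exhausted billing quota surface as HTTP 429 (`openai.RateLimitError`), but only the former is worth retrying. The `code` field in the error body tells them apart. A minimal sketch (the `classify_429` helper is illustrative, not part of RouteLLM; the error body is copied from the traceback above):

```python
# Distinguish the two causes of an OpenAI 429 by inspecting the error body.
# Example body copied from the traceback above.
body = {
    "error": {
        "message": "You exceeded your current quota, please check your plan and billing details.",
        "type": "insufficient_quota",
        "param": None,
        "code": "insufficient_quota",
    }
}

def classify_429(error_body: dict) -> str:
    """Return a human-readable diagnosis for a 429 error body.

    'insufficient_quota' means a billing problem: retrying will not help.
    Any other code is treated as transient rate limiting.
    """
    code = error_body["error"].get("code")
    if code == "insufficient_quota":
        return "billing: add credit or check your plan; retrying will not help"
    return "rate limit: back off and retry"

print(classify_429(body))  # billing: add credit or check your plan; retrying will not help
```

So in this case the SDK's automatic retries (visible as repeated `_retry_request` frames in the traceback) cannot succeed; the account needs credit first.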

bisratberhanu commented 4 months ago

Question: do we need a paid OpenAI account to access the GPT-4 model? If yes, can you tell me how to change my strong model to GPT-3.5 or another model? (I couldn't understand the README instructions on how to use different models.)

iojw commented 4 months ago

Yes, you do need a paid OpenAI account to use GPT-4. We do not serve any models directly; we only route to other providers like Anthropic or OpenAI.

You can change your strong model by adding e.g. `--strong-model claude-3.5` when launching the server. I would recommend using a model on the same "tier" as GPT-4 for the strong model.
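As a concrete sketch of a launch command: only `--strong-model` is confirmed above; the module path is inferred from `routellm/openai_server.py` in the traceback, and the `--routers` and `--weak-model` flags and model names are assumptions for illustration.

```shell
# Launch the RouteLLM OpenAI-compatible server with a non-GPT-4 strong model.
# Module path inferred from the traceback; --routers/--weak-model and the
# model names below are illustrative assumptions.
python -m routellm.openai_server \
  --routers mf \
  --strong-model claude-3.5 \
  --weak-model gpt-3.5-turbo
```

Note that even with a non-OpenAI strong model, the mf router itself still needs a working OpenAI key for embeddings, per the traceback above.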

Let me know if you have any other questions.

iojw commented 3 months ago

Closing for now, let me know if you have any other questions.