BerriAI / litellm

Python SDK, Proxy Server to call 100+ LLM APIs using the OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
https://docs.litellm.ai/docs/

OpenAI Proxy Server - AttributeError: 'Router' object has no attribute 'model_names' using Together AI #669

Closed · Rizaldy closed this 10 months ago

Rizaldy commented 10 months ago

What happened?

I was trying to get the OpenAI Proxy Server working on my local machine, following the documentation:

  1. git clone https://github.com/BerriAI/litellm.git
  2. Modify template_secrets.toml and add my TogetherAI API token
  3. pip install litellm
  4. Run litellm --model together_ai/WizardLM/WizardCoder-Python-34B-V1.0. The output on my terminal looks fine:
    
    Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new

Docs: https://docs.litellm.ai/docs/proxy_server

LiteLLM: Test your local endpoint with: "litellm --test" [In a new terminal tab]

LiteLLM: View available endpoints for this server on: http://0.0.0.0:8000

LiteLLM: Self-host your proxy using the following: https://docs.litellm.ai/docs/proxy_server#deploy-proxy

INFO:     Started server process [87766]
INFO:     Waiting for application startup.
INFO:     Application startup complete.
INFO:     Uvicorn running on http://0.0.0.0:8000 (Press CTRL+C to quit)


Then I tried to test with `litellm --test` in another terminal, and got the `'Router' object has no attribute 'model_names'` error shown in the relevant log output below.

I'm not using Docker since the local server already appears to be running. I'd really appreciate any help pointing out which step I'm missing.

Thanks!

### Relevant log output

```shell
'Router' object has no attribute 'model_names'
LiteLLM.Exception: 'Router' object has no attribute 'model_names'
INFO:     127.0.0.1:59454 - "POST /chat/completions HTTP/1.1" 500 Internal Server Error
ERROR:    Exception in ASGI application
Traceback (most recent call last):
  File "/Users/aldi/miniconda3/envs/litellm_env/lib/python3.11/site-packages/litellm/proxy/llm.py", line 138, in litellm_completion
    if model_router and data["model"] in model_router.get_model_names():
                                        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/aldi/miniconda3/envs/litellm_env/lib/python3.11/site-packages/litellm/router.py", line 100, in get_model_names
    return self.model_names
           ^^^^^^^^^^^^^^^^
AttributeError: 'Router' object has no attribute 'model_names'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/Users/aldi/miniconda3/envs/litellm_env/lib/python3.11/site-packages/uvicorn/protocols/http/h11_impl.py", line 408, in run_asgi
    result = await app(  # type: ignore[func-returns-value]
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/aldi/miniconda3/envs/litellm_env/lib/python3.11/site-packages/uvicorn/middleware/proxy_headers.py", line 84, in __call__
    return await self.app(scope, receive, send)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/aldi/miniconda3/envs/litellm_env/lib/python3.11/site-packages/fastapi/applications.py", line 1115, in __call__
    await super().__call__(scope, receive, send)
  File "/Users/aldi/miniconda3/envs/litellm_env/lib/python3.11/site-packages/starlette/applications.py", line 122, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/Users/aldi/miniconda3/envs/litellm_env/lib/python3.11/site-packages/starlette/middleware/errors.py", line 184, in __call__
    raise exc
  File "/Users/aldi/miniconda3/envs/litellm_env/lib/python3.11/site-packages/starlette/middleware/errors.py", line 162, in __call__
    await self.app(scope, receive, _send)
  File "/Users/aldi/miniconda3/envs/litellm_env/lib/python3.11/site-packages/starlette/middleware/cors.py", line 83, in __call__
    await self.app(scope, receive, send)
  File "/Users/aldi/miniconda3/envs/litellm_env/lib/python3.11/site-packages/starlette/middleware/exceptions.py", line 79, in __call__
    raise exc
  File "/Users/aldi/miniconda3/envs/litellm_env/lib/python3.11/site-packages/starlette/middleware/exceptions.py", line 68, in __call__
    await self.app(scope, receive, sender)
  File "/Users/aldi/miniconda3/envs/litellm_env/lib/python3.11/site-packages/fastapi/middleware/asyncexitstack.py", line 20, in __call__
    raise e
  File "/Users/aldi/miniconda3/envs/litellm_env/lib/python3.11/site-packages/fastapi/middleware/asyncexitstack.py", line 17, in __call__
    await self.app(scope, receive, send)
  File "/Users/aldi/miniconda3/envs/litellm_env/lib/python3.11/site-packages/starlette/routing.py", line 718, in __call__
    await route.handle(scope, receive, send)
  File "/Users/aldi/miniconda3/envs/litellm_env/lib/python3.11/site-packages/starlette/routing.py", line 276, in handle
    await self.app(scope, receive, send)
  File "/Users/aldi/miniconda3/envs/litellm_env/lib/python3.11/site-packages/starlette/routing.py", line 66, in app
    response = await func(request)
               ^^^^^^^^^^^^^^^^^^^
  File "/Users/aldi/miniconda3/envs/litellm_env/lib/python3.11/site-packages/fastapi/routing.py", line 274, in app
    raw_response = await run_endpoint_function(
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/aldi/miniconda3/envs/litellm_env/lib/python3.11/site-packages/fastapi/routing.py", line 191, in run_endpoint_function
    return await dependant.call(**values)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/aldi/miniconda3/envs/litellm_env/lib/python3.11/site-packages/litellm/proxy/proxy_server.py", line 515, in chat_completion
    return litellm_completion(data, type="chat_completion", user_model=user_model,
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/aldi/miniconda3/envs/litellm_env/lib/python3.11/site-packages/backoff/_sync.py", line 105, in retry
    ret = target(*args, **kwargs)
          ^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/aldi/miniconda3/envs/litellm_env/lib/python3.11/site-packages/backoff/_sync.py", line 105, in retry
    ret = target(*args, **kwargs)
          ^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/aldi/miniconda3/envs/litellm_env/lib/python3.11/site-packages/litellm/proxy/llm.py", line 148, in litellm_completion
    handle_llm_exception(e=e, user_api_base=user_api_base)
  File "/Users/aldi/miniconda3/envs/litellm_env/lib/python3.11/site-packages/litellm/proxy/llm.py", line 92, in handle_llm_exception
    raise UnknownLLMError from e
litellm.proxy.llm.UnknownLLMError
```
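What the traceback points at: `get_model_names()` returns `self.model_names`, and that attribute apparently never gets set on the Router instance the proxy creates. Below is a minimal sketch of that failure mode and a defensive workaround; it is an illustration only, not litellm's actual Router code or the eventual fix.

```python
class Router:
    """Stripped-down illustration of the failure mode; not litellm's real Router."""

    def __init__(self, model_list=None):
        # If no model list is passed in, self.model_names is never created.
        if model_list is not None:
            self.set_model_list(model_list)

    def set_model_list(self, model_list):
        # The {"model_name": ...} entry shape is an assumption for this sketch.
        self.model_list = model_list
        self.model_names = [m["model_name"] for m in model_list]

    def get_model_names(self):
        # Original behaviour per the traceback: `return self.model_names`,
        # which raises AttributeError when no model list was ever set.
        # Defensive variant: fall back to an empty list.
        return getattr(self, "model_names", [])
```

With the fallback, the proxy's check `data["model"] in model_router.get_model_names()` would simply evaluate to False instead of raising.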

Rizaldy commented 10 months ago

I tried creating a new file following the documentation, with this code:

import openai 

openai.api_key = "any-string-here"
openai.api_base = "http://0.0.0.0:8000" # your proxy url

# call openai
response = openai.ChatCompletion.create(model="gpt-3.5-turbo", messages=[{"role": "user", "content": "Hey"}])

print(response)

and the error is below

(litellm_env) aldi@Rizaldy-MacBook-Pro litellm % /Users/aldi/miniconda3/envs/litellm_env/bin/python /Users/aldi/Documents/Python/projects-3/litellm/test.py
Traceback (most recent call last):
  File "/Users/aldi/miniconda3/envs/litellm_env/lib/python3.11/site-packages/openai/api_requestor.py", line 413, in handle_error_response
    error_data = resp["error"]
                 ~~~~^^^^^^^^^
TypeError: string indices must be integers, not 'str'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/Users/aldi/Documents/Python/projects-3/litellm/test.py", line 7, in <module>
    response = openai.ChatCompletion.create(model="gpt-3.5-turbo", messages=[{"role": "user", "content": "Hey"}])
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/aldi/miniconda3/envs/litellm_env/lib/python3.11/site-packages/openai/api_resources/chat_completion.py", line 25, in create
    return super().create(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/aldi/miniconda3/envs/litellm_env/lib/python3.11/site-packages/openai/api_resources/abstract/engine_api_resource.py", line 155, in create
    response, _, api_key = requestor.request(
                           ^^^^^^^^^^^^^^^^^^
  File "/Users/aldi/miniconda3/envs/litellm_env/lib/python3.11/site-packages/openai/api_requestor.py", line 299, in request
    resp, got_stream = self._interpret_response(result, stream)
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/aldi/miniconda3/envs/litellm_env/lib/python3.11/site-packages/openai/api_requestor.py", line 710, in _interpret_response
    self._interpret_response_line(
  File "/Users/aldi/miniconda3/envs/litellm_env/lib/python3.11/site-packages/openai/api_requestor.py", line 775, in _interpret_response_line
    raise self.handle_error_response(
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/aldi/miniconda3/envs/litellm_env/lib/python3.11/site-packages/openai/api_requestor.py", line 415, in handle_error_response
    raise error.APIError(
openai.error.APIError: Invalid response object from API: 'Internal Server Error' (HTTP response code was 500)
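
For what it's worth, hitting the proxy directly (bypassing the openai client) makes it easier to read the raw error body behind the 500. A minimal sketch, assuming the proxy is still listening on http://0.0.0.0:8000:

```python
import requests

# Direct call to the proxy's OpenAI-compatible /chat/completions endpoint.
# The model name is just the one the proxy was started with; adjust as needed.
resp = requests.post(
    "http://0.0.0.0:8000/chat/completions",
    json={
        "model": "together_ai/WizardLM/WizardCoder-Python-34B-V1.0",
        "messages": [{"role": "user", "content": "Hey"}],
    },
    timeout=60,
)
print(resp.status_code)
print(resp.text)  # prints the server's raw response instead of the client's APIError wrapper
```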
elhombrejd commented 10 months ago

@Rizaldy the last error you reported is the one I was getting when I passed "NONE" as the API key.

When I replaced it with "hi", it worked.
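
Concretely, the only change in the client snippet above was giving the proxy a non-empty placeholder key (a sketch of what worked for me, not a documented requirement):

```python
import openai

openai.api_key = "hi"                    # any non-empty placeholder string; "NONE"/None gave me the APIError above
openai.api_base = "http://0.0.0.0:8000"  # point the client at the local LiteLLM proxy
```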

But I ended up with the same "AttributeError: 'Router' object has no attribute 'model_names'" problem, which I'm still running into.

elhombrejd commented 10 months ago

@Rizaldy forget it. The error keeps coming back while running litellm as a proxy server.

krrishdholakia commented 10 months ago

hey @elhombrejd @Rizaldy can you run `pip install --upgrade litellm`?

This should be resolved in the latest version of litellm.
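
If you want to double-check which version your environment actually picks up (the tracebacks above come from different conda envs), a quick check from Python works too:

```python
from importlib.metadata import version

# Requires Python 3.8+; the envs in this thread are 3.10/3.11.
print(version("litellm"))  # should match the upgraded release after `pip install --upgrade litellm`
```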

krrishdholakia commented 10 months ago

Please re-open if you still see the issue in v0.11.1

kim333 commented 10 months ago

@krrishdholakia I think I am having this issue with v0.11.1. I am trying to run local Ollama with litellm.

(autogen) a:other a$ pip show litellm
Name: litellm
Version: 0.11.1
(autogen) a:other a$ litellm --model ollama/mistral --api_base http://localhost:11434 --temperature 0.3 --max_tokens 2048

#------------------------------------------------------------#
#                                                            #
#              'I don't like how this works...'               #
#        https://github.com/BerriAI/litellm/issues/new        #
#                                                            #
#------------------------------------------------------------#
 Thank you for using LiteLLM! - Krrish & Ishaan

Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new

Docs: https://docs.litellm.ai/docs/proxy_server

ollama called
LiteLLM: Test your local endpoint with: "litellm --test" [In a new terminal tab]

LiteLLM: View available endpoints for this server on: http://0.0.0.0:8000

LiteLLM: Self-host your proxy using the following: https://docs.litellm.ai/docs/proxy_server#deploy-proxy

INFO:     Started server process [55605]
INFO:     Waiting for application startup.
INFO:     Application startup complete.
INFO:     Uvicorn running on http://0.0.0.0:8000 (Press CTRL+C to quit)
'Router' object has no attribute 'model_names'
LiteLLM.Exception: 'Router' object has no attribute 'model_names'
INFO:     127.0.0.1:53025 - "POST /chat/completions HTTP/1.1" 500 Internal Server Error
ERROR:    Exception in ASGI application
Traceback (most recent call last):
  File "/Users/opt/anaconda3/envs/autogen/lib/python3.10/site-packages/litellm/proxy/llm.py", line 138, in litellm_completion
    if model_router and data["model"] in model_router.get_model_names():
  File "/Users/opt/anaconda3/envs/autogen/lib/python3.10/site-packages/litellm/router.py", line 100, in get_model_names
    return self.model_names
AttributeError: 'Router' object has no attribute 'model_names'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/Users/opt/anaconda3/envs/autogen/lib/python3.10/site-packages/uvicorn/protocols/http/h11_impl.py", line 408, in run_asgi
    result = await app(  # type: ignore[func-returns-value]
  File "/Users/opt/anaconda3/envs/autogen/lib/python3.10/site-packages/uvicorn/middleware/proxy_headers.py", line 84, in __call__
    return await self.app(scope, receive, send)
  File "/Users/opt/anaconda3/envs/autogen/lib/python3.10/site-packages/fastapi/applications.py", line 1115, in __call__
    await super().__call__(scope, receive, send)
  File "/Users/opt/anaconda3/envs/autogen/lib/python3.10/site-packages/starlette/applications.py", line 122, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/Users/opt/anaconda3/envs/autogen/lib/python3.10/site-packages/starlette/middleware/errors.py", line 184, in __call__
    raise exc
  File "/Users/opt/anaconda3/envs/autogen/lib/python3.10/site-packages/starlette/middleware/errors.py", line 162, in __call__
    await self.app(scope, receive, _send)
  File "/Users/opt/anaconda3/envs/autogen/lib/python3.10/site-packages/starlette/middleware/cors.py", line 83, in __call__
    await self.app(scope, receive, send)
  File "/Users/opt/anaconda3/envs/autogen/lib/python3.10/site-packages/starlette/middleware/exceptions.py", line 79, in __call__
    raise exc
  File "/Users/opt/anaconda3/envs/autogen/lib/python3.10/site-packages/starlette/middleware/exceptions.py", line 68, in __call__
    await self.app(scope, receive, sender)
  File "/Users/opt/anaconda3/envs/autogen/lib/python3.10/site-packages/fastapi/middleware/asyncexitstack.py", line 20, in __call__
    raise e
  File "/Users/opt/anaconda3/envs/autogen/lib/python3.10/site-packages/fastapi/middleware/asyncexitstack.py", line 17, in __call__
    await self.app(scope, receive, send)
  File "/Users/opt/anaconda3/envs/autogen/lib/python3.10/site-packages/starlette/routing.py", line 718, in __call__
    await route.handle(scope, receive, send)
  File "/Users/opt/anaconda3/envs/autogen/lib/python3.10/site-packages/starlette/routing.py", line 276, in handle
    await self.app(scope, receive, send)
  File "/Users/opt/anaconda3/envs/autogen/lib/python3.10/site-packages/starlette/routing.py", line 66, in app
    response = await func(request)
  File "/Users/opt/anaconda3/envs/autogen/lib/python3.10/site-packages/fastapi/routing.py", line 274, in app
    raw_response = await run_endpoint_function(
  File "/Users/opt/anaconda3/envs/autogen/lib/python3.10/site-packages/fastapi/routing.py", line 191, in run_endpoint_function
    return await dependant.call(**values)
  File "/Users/opt/anaconda3/envs/autogen/lib/python3.10/site-packages/litellm/proxy/proxy_server.py", line 515, in chat_completion
    return litellm_completion(data, type="chat_completion", user_model=user_model,
  File "/Users/opt/anaconda3/envs/autogen/lib/python3.10/site-packages/backoff/_sync.py", line 105, in retry
    ret = target(*args, **kwargs)
  File "/Users/opt/anaconda3/envs/autogen/lib/python3.10/site-packages/backoff/_sync.py", line 105, in retry
    ret = target(*args, **kwargs)
  File "/Users/opt/anaconda3/envs/autogen/lib/python3.10/site-packages/litellm/proxy/llm.py", line 148, in litellm_completion
    handle_llm_exception(e=e, user_api_base=user_api_base)
  File "/Users/opt/anaconda3/envs/autogen/lib/python3.10/site-packages/litellm/proxy/llm.py", line 92, in handle_llm_exception
    raise UnknownLLMError from e
litellm.proxy.llm.UnknownLLMError
krrishdholakia commented 10 months ago

hey @Rizaldy @kim333 thanks for raising this. I just pushed the fix here - https://github.com/BerriAI/litellm/commit/cfed9fff74598c43f150ad5b8b29e4cc05ca989c. It should be out in v0.12.2 in 10-15 minutes.

krrishdholakia commented 10 months ago
[Screenshot: 2023-10-23 at 9:31 AM]

can confirm it works with ollama mistral

reactivetype commented 10 months ago

@krrishdholakia It looks like version 0.12.2 is not yet available via Poetry or on PyPI.

kim333 commented 10 months ago

Same here. I'm waiting for the 0.12.2 release!

krrishdholakia commented 10 months ago

@reactivetype @kim333 @Rizaldy thanks for raising this. Working on it now.

krrishdholakia commented 10 months ago

Fix is live in a dev release v0.12.4.dev2

We're working on a main release.

eleqtrizit commented 10 months ago

I arrived here with almost the same problem as the poster, but using local Ollama. I installed 0.12.4.dev2, but the litellm server just isn't answering: no CPU or GPU usage.

Normally this would seem like a separate bug, but I'm not sure whether it's specific to the dev release or was already a problem before.

Command: `litellm --model ollama/wizardcoder:13b-python --api_base http://localhost:8000`

Test command fails: `litellm --test --host 127.0.0.1`

Side note: Control-C doesn't kill the server; I have to `kill -9` it.
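
In case it helps narrow things down, a request with a short timeout distinguishes "nothing is listening" from "the server accepts the connection but never answers". A rough sketch, assuming the proxy is on port 8000:

```python
import requests

payload = {
    "model": "ollama/wizardcoder:13b-python",
    "messages": [{"role": "user", "content": "hi"}],
}

try:
    r = requests.post("http://127.0.0.1:8000/chat/completions", json=payload, timeout=15)
    print(r.status_code, r.text[:300])
except requests.exceptions.ConnectTimeout:
    print("could not connect within the timeout")
except requests.exceptions.ReadTimeout:
    print("connected, but the server never responded - a hang rather than a crash")
except requests.exceptions.ConnectionError:
    print("connection refused - nothing listening on 127.0.0.1:8000")
```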

krrishdholakia commented 10 months ago

thanks for raising it @eleqtrizit

krrishdholakia commented 10 months ago

I'm on v0.12.4 locally and I can confirm this works:

[Screenshots: 2023-10-24 at 4:25 PM]