BerriAI / litellm

Python SDK, Proxy Server to call 100+ LLM APIs using the OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
https://docs.litellm.ai/docs/

Command nightly not working w/ Azure client. #2159

Closed: davisuga closed this issue 2 months ago

davisuga commented 6 months ago

Logs:

INFO:     127.0.0.1:41442 - "POST /openai/deployments/command-nightly/chat/completions?api-version=2023-12-01-preview HTTP/1.1" 200 OK
Traceback (most recent call last):
  File "/home/ubuntu/.local/lib/python3.10/site-packages/litellm/proxy/proxy_server.py", line 2045, in async_data_generator
    async for chunk in response:
TypeError: 'async for' requires an object with __aiter__ method, got NoneType
ERROR:    Exception in ASGI application
Traceback (most recent call last):
  File "/home/ubuntu/.local/lib/python3.10/site-packages/starlette/middleware/exceptions.py", line 68, in __call__
    await self.app(scope, receive, sender)
  File "/home/ubuntu/.local/lib/python3.10/site-packages/fastapi/middleware/asyncexitstack.py", line 20, in __call__
    raise e
  File "/home/ubuntu/.local/lib/python3.10/site-packages/fastapi/middleware/asyncexitstack.py", line 17, in __call__
    await self.app(scope, receive, send)
  File "/home/ubuntu/.local/lib/python3.10/site-packages/starlette/routing.py", line 718, in __call__
    await route.handle(scope, receive, send)
  File "/home/ubuntu/.local/lib/python3.10/site-packages/starlette/routing.py", line 276, in handle
    await self.app(scope, receive, send)
  File "/home/ubuntu/.local/lib/python3.10/site-packages/starlette/routing.py", line 69, in app
    await response(scope, receive, send)
  File "/home/ubuntu/.local/lib/python3.10/site-packages/starlette/responses.py", line 270, in __call__
    async with anyio.create_task_group() as task_group:
  File "/home/ubuntu/.local/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 597, in __aexit__
    raise exceptions[0]
  File "/home/ubuntu/.local/lib/python3.10/site-packages/starlette/responses.py", line 273, in wrap
    await func()
  File "/home/ubuntu/.local/lib/python3.10/site-packages/starlette/responses.py", line 262, in stream_response
    async for chunk in self.body_iterator:
  File "/home/ubuntu/.local/lib/python3.10/site-packages/litellm/proxy/proxy_server.py", line 2085, in async_data_generator
    raise ProxyException(
litellm.proxy.proxy_server.ProxyException

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/home/ubuntu/.local/lib/python3.10/site-packages/uvicorn/protocols/http/h11_impl.py", line 428, in run_asgi
    result = await app(  # type: ignore[func-returns-value]
  File "/home/ubuntu/.local/lib/python3.10/site-packages/uvicorn/middleware/proxy_headers.py", line 78, in __call__
    return await self.app(scope, receive, send)
  File "/home/ubuntu/.local/lib/python3.10/site-packages/fastapi/applications.py", line 1106, in __call__
    await super().__call__(scope, receive, send)
  File "/home/ubuntu/.local/lib/python3.10/site-packages/starlette/applications.py", line 122, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/home/ubuntu/.local/lib/python3.10/site-packages/starlette/middleware/errors.py", line 184, in __call__
    raise exc
  File "/home/ubuntu/.local/lib/python3.10/site-packages/starlette/middleware/errors.py", line 162, in __call__
    await self.app(scope, receive, _send)
  File "/home/ubuntu/.local/lib/python3.10/site-packages/starlette/middleware/cors.py", line 83, in __call__
    await self.app(scope, receive, send)
  File "/home/ubuntu/.local/lib/python3.10/site-packages/starlette/middleware/exceptions.py", line 83, in __call__
    raise RuntimeError(msg) from exc
RuntimeError: Caught handled exception, but response already started.
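
For context, the failure mode in the traceback is the proxy's streaming generator receiving None instead of an async iterator from the upstream call. A minimal standalone sketch (not litellm's actual code; the names here are hypothetical) of how that TypeError arises, and how an explicit guard would surface it more cleanly:

import asyncio

async def fake_completion(stream_ok: bool):
    # Stands in for the upstream provider call; returns None on the failing path.
    if not stream_ok:
        return None

    async def gen():
        for chunk in ("hello", " ", "world"):
            yield chunk

    return gen()

async def data_generator(response):
    # Guard that avoids the mid-stream crash: without it, `async for` on None raises
    # "'async for' requires an object with __aiter__ method, got NoneType" after the
    # 200 response has already started, which is what produces the
    # "response already started" RuntimeError above.
    if response is None:
        raise ValueError("provider returned no stream for this deployment")
    async for chunk in response:
        yield chunk

async def main():
    response = await fake_completion(stream_ok=False)
    async for chunk in data_generator(response):
        print(chunk)

asyncio.run(main())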

Gemini works fine, and the exact same call succeeds through the proxy's OpenAI-compatible endpoint:

INFO:     127.0.0.1:49088 - "POST /chat/completions HTTP/1.1" 200 OK
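
For reference, the two call paths being compared look roughly like this - a sketch only; the proxy base URL, port, and API keys below are placeholders, not taken from the report:

from openai import OpenAI, AzureOpenAI

PROXY_BASE = "http://127.0.0.1:4000"  # placeholder; point this at the running litellm proxy

# OpenAI-compatible route (POST /chat/completions) - succeeds per the log above
openai_client = OpenAI(base_url=PROXY_BASE, api_key="sk-anything")
ok_stream = openai_client.chat.completions.create(
    model="command-nightly",
    messages=[{"role": "user", "content": "hi"}],
    stream=True,
)
for chunk in ok_stream:
    print(chunk.choices[0].delta.content or "", end="")

# Azure-style route (POST /openai/deployments/command-nightly/chat/completions
# ?api-version=2023-12-01-preview) - the path that fails in the traceback
azure_client = AzureOpenAI(
    azure_endpoint=PROXY_BASE,
    api_key="sk-anything",
    api_version="2023-12-01-preview",
)
azure_stream = azure_client.chat.completions.create(
    model="command-nightly",
    messages=[{"role": "user", "content": "hi"}],
    stream=True,
)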
ishaan-jaff commented 6 months ago

So @davisuga, you started the proxy server with command-nightly, and it works fine when querying with the OpenAI client but fails with the Azure client?

davisuga commented 6 months ago

So @davisuga, you started the proxy server with command-nightly, and it works fine when querying with the OpenAI client but fails with the Azure client?

Exactly.

krrishdholakia commented 2 months ago

This should now be fixed, @davisuga - closing for now. Please reopen/bump me if not.


Doing some price discovery - curious, would you pay for LiteLLM Enterprise (prioritized features, support, etc.) today?