BerriAI / litellm

Python SDK, Proxy Server (LLM Gateway) to call 100+ LLM APIs in OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
https://docs.litellm.ai/docs/

[Bug]: Google Gemini fails - response already started #2054

Closed · SethBurkart123 closed this issue 7 months ago

SethBurkart123 commented 7 months ago

What happened?

I'm trying to get Gemini Pro working with my LiteLLM setup. I'm fairly new to it and am just using it as a proxy with a .yaml configuration, but for some reason the call fails when I hit the API with streaming enabled. I'm not sure what the problem is, but I'm happy to provide more information if requested.
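
A minimal reproduction of this kind of setup, going by the LiteLLM proxy docs, would look roughly like the following (a sketch: the gemini/ provider prefix, the GEMINI_API_KEY variable, and the default port are assumptions, not the reporter's exact config).

config.yaml:

model_list:
  - model_name: gemini-pro
    litellm_params:
      model: gemini/gemini-pro            # Google AI Studio route
      api_key: os.environ/GEMINI_API_KEY  # read from the environment

Start the proxy and make an OpenAI-format streaming call:

litellm --config config.yaml

curl http://0.0.0.0:8000/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "gemini-pro", "messages": [{"role": "user", "content": "hi"}], "stream": true}'

With "stream": true, the request fails with the traceback below.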

Relevant log output

Traceback (most recent call last):
  File "/home/seth/.local/lib/python3.10/site-packages/litellm/proxy/proxy_server.py", line 2020, in async_data_generator
    async for chunk in response:
TypeError: 'async for' requires an object with __aiter__ method, got GenerateContentResponse
ERROR:    Exception in ASGI application
Traceback (most recent call last):
  File "/home/seth/.local/lib/python3.10/site-packages/starlette/middleware/exceptions.py", line 68, in __call__
    await self.app(scope, receive, sender)
  File "/home/seth/.local/lib/python3.10/site-packages/fastapi/middleware/asyncexitstack.py", line 20, in __call__
    raise e
  File "/home/seth/.local/lib/python3.10/site-packages/fastapi/middleware/asyncexitstack.py", line 17, in __call__
    await self.app(scope, receive, send)
  File "/home/seth/.local/lib/python3.10/site-packages/starlette/routing.py", line 718, in __call__
    await route.handle(scope, receive, send)
  File "/home/seth/.local/lib/python3.10/site-packages/starlette/routing.py", line 276, in handle
    await self.app(scope, receive, send)
  File "/home/seth/.local/lib/python3.10/site-packages/starlette/routing.py", line 69, in app
    await response(scope, receive, send)
  File "/home/seth/.local/lib/python3.10/site-packages/starlette/responses.py", line 270, in __call__
    async with anyio.create_task_group() as task_group:
  File "/home/seth/.local/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 597, in __aexit__
    raise exceptions[0]
  File "/home/seth/.local/lib/python3.10/site-packages/starlette/responses.py", line 273, in wrap
    await func()
  File "/home/seth/.local/lib/python3.10/site-packages/starlette/responses.py", line 262, in stream_response
    async for chunk in self.body_iterator:
  File "/home/seth/.local/lib/python3.10/site-packages/litellm/proxy/proxy_server.py", line 2060, in async_data_generator
    raise ProxyException(
litellm.proxy.proxy_server.ProxyException

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/home/seth/.local/lib/python3.10/site-packages/uvicorn/protocols/http/httptools_impl.py", line 435, in run_asgi
    result = await app(  # type: ignore[func-returns-value]
  File "/home/seth/.local/lib/python3.10/site-packages/uvicorn/middleware/proxy_headers.py", line 78, in __call__
    return await self.app(scope, receive, send)
  File "/home/seth/.local/lib/python3.10/site-packages/fastapi/applications.py", line 1106, in __call__
    await super().__call__(scope, receive, send)
  File "/home/seth/.local/lib/python3.10/site-packages/starlette/applications.py", line 122, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/home/seth/.local/lib/python3.10/site-packages/starlette/middleware/errors.py", line 184, in __call__
    raise exc
  File "/home/seth/.local/lib/python3.10/site-packages/starlette/middleware/errors.py", line 162, in __call__
    await self.app(scope, receive, _send)
  File "/home/seth/.local/lib/python3.10/site-packages/starlette/middleware/cors.py", line 83, in __call__
    await self.app(scope, receive, send)
  File "/home/seth/.local/lib/python3.10/site-packages/starlette/middleware/exceptions.py", line 83, in __call__
    raise RuntimeError(msg) from exc
RuntimeError: Caught handled exception, but response already started.


krrishdholakia commented 7 months ago

Hey @SethBurkart123, is this with vertex ai or google studio?

SethBurkart123 commented 7 months ago

> Hey @SethBurkart123, is this with vertex ai or google studio?

This is using google studio.
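
That matches the traceback. With Google AI Studio, the underlying google.generativeai SDK returns a synchronous GenerateContentResponse iterator for streamed calls, but the proxy's async_data_generator consumes the stream with async for, which requires an object implementing __aiter__; hence the initial TypeError. And because the streaming HTTP response had already started by the time the ProxyException was raised, Starlette can no longer send an error response, which produces the "Caught handled exception, but response already started" RuntimeError in the title. A shim along the following lines bridges a blocking iterator into an async one (a minimal sketch of the general technique, not LiteLLM's actual fix; aiter_from_sync is a hypothetical helper name):

import asyncio
from typing import AsyncIterator, Iterable, TypeVar

T = TypeVar("T")

async def aiter_from_sync(chunks: Iterable[T]) -> AsyncIterator[T]:
    # Wrap a blocking iterator (e.g. a streamed GenerateContentResponse)
    # so it can be consumed with `async for`.
    it = iter(chunks)
    loop = asyncio.get_running_loop()
    done = object()  # sentinel marking exhaustion
    while True:
        # Pull next() in a worker thread so a slow network read
        # doesn't block the event loop.
        chunk = await loop.run_in_executor(None, next, it, done)
        if chunk is done:
            break
        yield chunk

With a wrapper like this, the proxy-side loop could hypothetically iterate as `async for chunk in aiter_from_sync(response)` instead of iterating the sync response directly.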

ishaan-jaff commented 6 months ago

Hi @SethBurkart123, can we get on a quick call to get your feedback on LiteLLM Proxy and how we can improve it for you? Sharing a link to my cal for your convenience: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat?month=2024-03