BerriAI / litellm

Python SDK, Proxy Server (LLM Gateway) to call 100+ LLM APIs in OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
https://docs.litellm.ai/docs/

[Bug]: Nextcloud -> LiteLLM -> Ollama -> async_generator object is not iterable #3357

Closed gitwittidbit closed 2 months ago

gitwittidbit commented 5 months ago

What happened?

It appears a bug happened - but it may well be attributable to my incompetence...

I am using this in my docker-compose.yml

litellm:
  container_name: litellm
  image: litellm/litellm:v1.35.31
  volumes:

And this is my .env:

OLLAMA_BASE_URL='http://localhost:11434'

SCARF_NO_ANALYTICS=true
DO_NOT_TRACK=true

LITELLM_LOCAL_MODEL_COST_MAP="True"
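
As an aside, `localhost` inside a Docker container resolves to the container itself, not the host machine, so an `OLLAMA_BASE_URL` like the one above may never reach an Ollama server running on the host. A common workaround (assuming Docker Desktop, or an `extra_hosts: ["host.docker.internal:host-gateway"]` mapping on Linux) is:

```
OLLAMA_BASE_URL='http://host.docker.internal:11434'
```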

And this is my litellm-config.yaml:

model_list:

litellm_settings:
  drop_params: True
  max_budget: 100
  budget_duration: 30d
  cache: True # set cache responses to True, litellm defaults to using a redis cache
  cache_params: # cache_params are optional
    type: "local" # The type of cache to initialize. Can be "local" or "redis". Defaults to "local".
    supported_call_types: ["acompletion", "completion", "embedding", "aembedding"] # defaults to all litellm call types

general_settings:
  master_key: sk-1234

Relevant log output

INFO:     192.168.XXX.XXX:32892 - "POST /v1/chat/completions HTTP/1.1" 500 Internal Server Error
ERROR:    Exception in ASGI application
Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/fastapi/encoders.py", line 230, in jsonable_encoder
    data = dict(obj)
           ^^^^^^^^^
TypeError: 'async_generator' object is not iterable

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/fastapi/encoders.py", line 235, in jsonable_encoder
    data = vars(obj)
           ^^^^^^^^^
TypeError: vars() argument must have __dict__ attribute

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/uvicorn/protocols/http/h11_impl.py", line 407, in run_asgi
    result = await app(  # type: ignore[func-returns-value]
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/uvicorn/middleware/proxy_headers.py", line 69, in __call__
    return await self.app(scope, receive, send)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/fastapi/applications.py", line 289, in __call__
    await super().__call__(scope, receive, send)
  File "/usr/local/lib/python3.11/site-packages/starlette/applications.py", line 122, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/usr/local/lib/python3.11/site-packages/starlette/middleware/errors.py", line 184, in __call__
    raise exc
  File "/usr/local/lib/python3.11/site-packages/starlette/middleware/errors.py", line 162, in __call__
    await self.app(scope, receive, _send)
  File "/usr/local/lib/python3.11/site-packages/starlette/middleware/cors.py", line 83, in __call__
    await self.app(scope, receive, send)
  File "/usr/local/lib/python3.11/site-packages/starlette/middleware/exceptions.py", line 79, in __call__
    raise exc
  File "/usr/local/lib/python3.11/site-packages/starlette/middleware/exceptions.py", line 68, in __call__
    await self.app(scope, receive, sender)
  File "/usr/local/lib/python3.11/site-packages/fastapi/middleware/asyncexitstack.py", line 20, in __call__
    raise e
  File "/usr/local/lib/python3.11/site-packages/fastapi/middleware/asyncexitstack.py", line 17, in __call__
    await self.app(scope, receive, send)
  File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 718, in __call__
    await route.handle(scope, receive, send)
  File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 276, in handle
    await self.app(scope, receive, send)
  File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 66, in app
    response = await func(request)
               ^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/fastapi/routing.py", line 291, in app
    content = await serialize_response(
              ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/fastapi/routing.py", line 179, in serialize_response
    return jsonable_encoder(response_content)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/fastapi/encoders.py", line 238, in jsonable_encoder
    raise ValueError(errors) from e
ValueError: [TypeError("'async_generator' object is not iterable"), TypeError('vars() argument must have __dict__ attribute')]
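
What the traceback shows: the endpoint handed FastAPI's `jsonable_encoder` an async generator (i.e. a streaming response object) instead of a plain response, and `jsonable_encoder` falls back to trying `dict(obj)` and then `vars(obj)`, both of which fail on an async generator. A minimal stdlib-only reproduction, independent of LiteLLM:

```python
import asyncio

async def token_stream():
    # Stand-in for a streaming LLM response: an async generator of chunks.
    yield {"choices": [{"delta": {"content": "Hello"}}]}

gen = token_stream()

# dict(obj) is the first thing FastAPI's jsonable_encoder tries; async
# generators are not (synchronously) iterable, so it raises TypeError.
caught = None
try:
    dict(gen)
except TypeError as exc:
    caught = exc
print(caught)  # 'async_generator' object is not iterable

# The generator has to be consumed asynchronously instead:
async def collect():
    return [chunk async for chunk in token_stream()]

chunks = asyncio.run(collect())
print(len(chunks))  # 1
```

In other words, the proxy returned the raw generator to FastAPI's normal JSON serialization path instead of consuming it (or wrapping it in a streaming response), which is why the request 500s rather than streaming.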

Twitter / LinkedIn details

No response

gitwittidbit commented 5 months ago

Seems the jury is still out on whether this is a bug...

edwinjosegeorge commented 4 months ago

It seems this error is similar to https://github.com/BerriAI/litellm/issues/3820. @gitwittidbit, can you verify whether this issue is still occurring? Otherwise, we can close it.
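
For context on why the streaming path matters here: when streaming works, the server consumes the async generator and serializes each chunk as a server-sent event, rather than handing the generator object to the JSON encoder. A rough stdlib-only sketch of that pattern (illustrative only, not LiteLLM's actual code; `token_stream` is a made-up stand-in for a model's streaming output):

```python
import asyncio
import json

async def token_stream():
    # Hypothetical stand-in for a model's streaming output.
    for word in ("Hello", " world"):
        yield {"choices": [{"delta": {"content": word}}]}

async def to_sse(gen):
    # Wrap each chunk as a server-sent event, as OpenAI-compatible
    # streaming endpoints do, then terminate the stream with [DONE].
    async for chunk in gen:
        yield f"data: {json.dumps(chunk)}\n\n"
    yield "data: [DONE]\n\n"

async def main():
    return [event async for event in to_sse(token_stream())]

events = asyncio.run(main())
print(events[0])
```

Returning the generator directly to the regular (non-streaming) response path instead of a pattern like this is exactly what produces the `'async_generator' object is not iterable` error in the log above.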