BerriAI / litellm

Python SDK, Proxy Server to call 100+ LLM APIs using the OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
https://docs.litellm.ai/docs/

Startup error after adding health check lines to YAML config #3201

Open · wac81 opened this issue 4 months ago

wac81 commented 4 months ago

What happened?

I get an error after adding the health check lines to my config:

model_list:

router_settings:
  routing_strategy: "least-busy"

general_settings:
  background_health_checks: True # enable background health checks
  health_check_interval: 30 # frequency of background health checks

I then start LiteLLM with: litellm --config multi_llama.cpp.yaml
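For context, a minimal sketch of what a full config of this shape looks like; the model entry below is illustrative only (the alias, api_base, and port are hypothetical placeholders, not the actual deployment values):

```yaml
# Minimal sketch; the model entry is a hypothetical placeholder,
# not the real llama.cpp deployment values.
model_list:
  - model_name: llama-cpp-local            # hypothetical alias
    litellm_params:
      model: openai/local-model            # served via an OpenAI-compatible endpoint
      api_base: http://localhost:8080/v1   # hypothetical llama.cpp server address
      api_key: "none"                      # local servers typically need no key

router_settings:
  routing_strategy: "least-busy"

general_settings:
  background_health_checks: True   # enable background health checks
  health_check_interval: 30        # frequency of background health checks (seconds)
```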

Relevant log output

Traceback (most recent call last):
  File "/data/miniconda3/envs/py311/lib/python3.11/site-packages/litellm/main.py", line 3790, in ahealth_check
    response = await openai_chat_completions.ahealth_check(
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/data/miniconda3/envs/py311/lib/python3.11/site-packages/litellm/llms/openai.py", line 942, in ahealth_check
    completion = await client.completions.with_raw_response.create(
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/data/miniconda3/envs/py311/lib/python3.11/site-packages/openai/_legacy_response.py", line 353, in wrapped
    return cast(LegacyAPIResponse[R], await func(*args, **kwargs))
                                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/data/miniconda3/envs/py311/lib/python3.11/site-packages/openai/resources/completions.py", line 1036, in create
    return await self._post(
           ^^^^^^^^^^^^^^^^^
  File "/data/miniconda3/envs/py311/lib/python3.11/site-packages/openai/_base_client.py", line 1782, in post
    return await self.request(cast_to, opts, stream=stream, stream_cls=stream_cls)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/data/miniconda3/envs/py311/lib/python3.11/site-packages/openai/_base_client.py", line 1485, in request
    return await self._request(
           ^^^^^^^^^^^^^^^^^^^^
  File "/data/miniconda3/envs/py311/lib/python3.11/site-packages/openai/_base_client.py", line 1576, in _request
    raise self._make_status_error_from_response(err.response) from None
openai.NotFoundError: File Not Found


wac81 commented 4 months ago

LiteLLM: Current Version = 1.35.4