BerriAI / litellm

Call all LLM APIs using the OpenAI format. Use Bedrock, Azure, OpenAI, Cohere, Anthropic, Ollama, Sagemaker, HuggingFace, Replicate (100+ LLMs)
https://docs.litellm.ai/docs/

[Bug]: The latest release broke the proxy command when specifying a config #4324

Closed: vanpelt closed this issue 1 week ago

vanpelt commented 1 week ago

What happened?

Running `litellm --config my-config.yaml` blows up. It looks like an indentation snafu. When fixing this, a test would be a good idea ;)
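For context, here is a minimal sketch of how an indentation slip produces exactly this error. This is hypothetical code, not litellm's actual implementation; only the variable name comes from the traceback below:

```python
# Hypothetical reconstruction of the failure mode; the variable name
# mirrors the traceback, everything else is illustrative.
def run_server_broken(use_config: bool) -> None:
    if not use_config:
        # The assignment is accidentally indented under this branch, so it
        # never executes when a config file is passed.
        key_management_settings = None
    # Because the function assigns the name *somewhere*, Python treats it as
    # a local, and reading it here raises UnboundLocalError (not NameError)
    # when use_config=True.
    if key_management_settings is not None:
        print("configuring key management")


def run_server_fixed(use_config: bool) -> None:
    key_management_settings = None  # bind a default before any branching
    if use_config:
        key_management_settings = {"store_keys": True}  # placeholder value
    if key_management_settings is not None:
        print("configuring key management")


run_server_fixed(use_config=True)  # runs fine
try:
    run_server_broken(use_config=True)
except UnboundLocalError as e:
    print(e)  # cannot access local variable 'key_management_settings' ...
```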

Relevant log output

File "/venv/bin/litellm", line 8, in <module>
    sys.exit(run_server())
             ^^^^^^^^^^^^
  File "/venv/lib/python3.12/site-packages/click/core.py", line 1157, in __call__
    return self.main(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/venv/lib/python3.12/site-packages/click/core.py", line 1078, in main
    rv = self.invoke(ctx)
         ^^^^^^^^^^^^^^^^
  File "/venv/lib/python3.12/site-packages/click/core.py", line 1434, in invoke
    return ctx.invoke(self.callback, **ctx.params)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/venv/lib/python3.12/site-packages/click/core.py", line 783, in invoke
    return __callback(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/venv/lib/python3.12/site-packages/litellm/proxy/proxy_cli.py", line 498, in run_server
    if key_management_settings is not None:
       ^^^^^^^^^^^^^^^^^^^^^^^
UnboundLocalError: cannot access local variable 'key_management_settings' where it is not associated with a value
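Along the lines of the test suggested above, a regression test could invoke the CLI entry point with click's CliRunner. A rough sketch follows; the import path and `--config` flag come from the traceback, but the minimal config contents are an assumption, and in a real test the actual server startup would need to be stubbed out so the invocation does not block:

```python
# Sketch of a regression test; assumes run_server is the click command shown
# in the traceback and that an empty model_list is a valid minimal config.
from click.testing import CliRunner

from litellm.proxy.proxy_cli import run_server


def test_run_server_accepts_config(tmp_path):
    config = tmp_path / "my-config.yaml"
    config.write_text("model_list: []\n")
    result = CliRunner().invoke(run_server, ["--config", str(config)])
    # The command should at least get past argument handling without the
    # UnboundLocalError reported in this issue.
    assert not isinstance(result.exception, UnboundLocalError)
```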

Twitter / LinkedIn details

@vanpelt