BerriAI / litellm

Call all LLM APIs using the OpenAI format. Use Bedrock, Azure, OpenAI, Cohere, Anthropic, Ollama, Sagemaker, HuggingFace, Replicate (100+ LLMs)
https://docs.litellm.ai/docs/

[Bug]: crash when select model on demo website #410

Closed: okisdev closed this issue 10 months ago

okisdev commented 10 months ago

What happened?

Please see the video below.

https://github.com/BerriAI/litellm/assets/66008528/069cfecb-5b88-4d57-8318-5bce8390a483

Relevant log output

No response

Twitter / LinkedIn details

@okisdev

krrishdholakia commented 10 months ago

Hi @okisdev, we were planning on deprecating it. How did you come across it? Was there a specific model you were trying to use?

okisdev commented 10 months ago

Hi @krrishdholakia. I was just browsing this project because my colleague mentioned it.

krrishdholakia commented 10 months ago

If you're looking for a quick way to explore the package via code, here's a snippet that should work:

!pip install litellm

import os
from litellm import completion

## set ENV variables
os.environ["OPENAI_API_KEY"] = "sk-litellm-7_NPZhMGxY2GoHC59LgbDw" # [OPTIONAL] replace with your openai key
os.environ["COHERE_API_KEY"] = "sk-litellm-7_NPZhMGxY2GoHC59LgbDw" # [OPTIONAL] replace with your cohere key

messages = [{"content": "Hello, how are you?", "role": "user"}]

# openai call
response = completion(model="gpt-3.5-turbo", messages=messages)

# cohere call
response = completion(model="command-nightly", messages=messages)

krrishdholakia commented 10 months ago

We've had a couple of people mention the playground issue, so I think it might be worth revisiting how to bring it back in an easier-to-maintain fashion.

cc: @ishaan-jaff