BerriAI / litellm

Python SDK, Proxy Server (LLM Gateway) to call 100+ LLM APIs in OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
https://docs.litellm.ai/docs/

[Bug]: Add support for AI21 Jamba models #5475

Closed. toniengelhardt closed this issue 2 months ago.

toniengelhardt commented 2 months ago

What happened?

Getting "Not found" errors for Jamba 1.5 large / mini models

This is the output with verbose=True; no error details are given:

[Screenshots of the verbose output (2024-09-02) showing the "Not found" error responses]

I also tried the model names ai21/jamba-1.5-large and ai21/jamba-1.5-large@001.
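
For reference, a minimal reproduction sketch of the failing call (assuming an AI21 API key is configured in the environment and using the ai21/jamba-1.5-large model name from the report):

```python
# Minimal reproduction sketch; assumes AI21_API_KEY is set in the environment.
import litellm

litellm.set_verbose = True  # same verbose output referenced above

response = litellm.completion(
    model="ai21/jamba-1.5-large",
    messages=[{"role": "user", "content": "Hello, Jamba!"}],
)
print(response.choices[0].message.content)
```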


ishaan-jaff commented 2 months ago

working on this - we need to add support for their new chat endpoint
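
For context, the "new chat endpoint" mentioned above can be exercised directly. The sketch below is an assumption based on AI21's OpenAI-style chat completions API; the endpoint path, the AI21_API_KEY variable, and the payload shape are illustrative and not taken from litellm or the linked PR:

```python
# Rough sketch of a direct request to AI21's chat completions endpoint.
# The URL and payload shape here are assumptions, not litellm code.
import os
import requests

resp = requests.post(
    "https://api.ai21.com/studio/v1/chat/completions",
    headers={"Authorization": f"Bearer {os.environ['AI21_API_KEY']}"},
    json={
        "model": "jamba-1.5-large",
        "messages": [{"role": "user", "content": "Hello, Jamba!"}],
    },
)
print(resp.json())
```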

ishaan-jaff commented 2 months ago

PR here: https://github.com/BerriAI/litellm/pull/5478