BerriAI / litellm

Python SDK, Proxy Server (LLM Gateway) to call 100+ LLM APIs in OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
https://docs.litellm.ai/docs/

(stable nov 21st release) #6863

Open ishaan-jaff opened 6 hours ago

ishaan-jaff commented 6 hours ago

Stable release for Nov 21st

Fixes https://github.com/BerriAI/litellm/issues/6766

Title

Relevant issues

Type

🆕 New Feature 🐛 Bug Fix 🧹 Refactoring 📖 Documentation 🚄 Infrastructure ✅ Test

Changes

[REQUIRED] Testing - Attach a screenshot of any new tests passing locally

If UI changes, send a screenshot/GIF of working UI fixes

Description by Korbit AI

What change is being made?

Enhance JSON response handling by updating the JSON schema structure and adding "supports_response_schema" support to several models, remove a duplicate model entry, and introduce new unit tests that validate JSON response formats.

Why are these changes being made?

These changes improve the compatibility and robustness of JSON response handling by encapsulating schema properties within a "values" object, keeping the structure consistent across models. The duplicate "anthropic/claude-3-5-sonnet-20241022" entry was removed to avoid duplication. New unit tests verify that models which declare response-schema support handle it correctly, and that edge cases such as Internal Server Errors are handled gracefully.
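For context, here is a minimal sketch of how a response-schema request flows through litellm, assuming the public `litellm.completion()` and `litellm.supports_response_schema()` helpers; the model name and schema below are illustrative placeholders, not values taken from this PR:

```python
# Illustrative sketch only; assumes litellm's completion() and
# supports_response_schema() helpers. The model name and schema are
# placeholders, not taken from this PR.
import json

import litellm

MODEL = "gemini/gemini-1.5-pro"  # assumed schema-capable model

RESPONSE_FORMAT = {
    "type": "json_schema",
    "json_schema": {
        "name": "math_answer",
        "schema": {
            "type": "object",
            "properties": {
                "steps": {"type": "array", "items": {"type": "string"}},
                "final_answer": {"type": "string"},
            },
            "required": ["steps", "final_answer"],
            "additionalProperties": False,
        },
    },
}


def ask_structured(question: str) -> dict:
    # Skip providers/models that do not advertise response-schema support.
    if not litellm.supports_response_schema(model=MODEL):
        raise ValueError(f"{MODEL} does not report response_schema support")

    resp = litellm.completion(
        model=MODEL,
        messages=[{"role": "user", "content": question}],
        response_format=RESPONSE_FORMAT,
    )
    # The returned content should be valid JSON matching the declared schema.
    return json.loads(resp.choices[0].message.content)


if __name__ == "__main__":
    print(ask_structured("What is 8 + 5? Show your steps."))
```

The unit tests described above presumably exercise this kind of path for the models that gained "supports_response_schema", plus the error-handling branch for Internal Server Errors.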

Is this description stale? Ask me to generate a new description by commenting /korbit-generate-pr-description

vercel[bot] commented 6 hours ago

The latest updates on your projects. Learn more about Vercel for Git ↗︎

| Name | Status | Preview | Comments | Updated (UTC) |
| --- | --- | --- | --- | --- |
| litellm | ✅ Ready (Inspect) | Visit Preview | 💬 Add feedback | Nov 22, 2024 8:13am |