BerriAI / litellm

Python SDK, Proxy Server (LLM Gateway) to call 100+ LLM APIs in OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
https://docs.litellm.ai/docs/
14.08k stars · 1.66k forks

Fixed #6797 Fireworks AI structured outputs #6802

Open DaveDeCaprio opened 3 days ago

DaveDeCaprio commented 3 days ago

Fix Fireworks AI structured outputs

Relevant issues

Fixes #6797

Type

🐛 Bug Fix

Changes

Updated the `fireworks_ai_transformation` to use the format specified in the Fireworks documentation: https://docs.fireworks.ai/structured-responses/structured-response-formatting
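For context, the shape difference between the two providers can be sketched as below. This is a hedged illustration only: `convert_openai_to_fireworks` is a hypothetical helper, not the actual litellm function, and it assumes the OpenAI `json_schema` style on the input side and the Fireworks `{"type": "json_object", "schema": ...}` style described in the linked docs on the output side.

```python
def convert_openai_to_fireworks(response_format: dict) -> dict:
    """Illustrative sketch of the translation this PR performs.

    OpenAI style:    {"type": "json_schema", "json_schema": {"name": ..., "schema": {...}}}
    Fireworks style: {"type": "json_object", "schema": {...}}
    """
    if response_format.get("type") == "json_schema":
        # Lift the inner JSON schema up to the top-level "schema" key
        # that Fireworks expects, and switch the type to "json_object".
        schema = response_format.get("json_schema", {}).get("schema")
        return {"type": "json_object", "schema": schema}
    # Anything else (e.g. plain {"type": "json_object"}) passes through unchanged.
    return response_format
```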

vercel[bot] commented 3 days ago

litellm deployment preview: ✅ Ready (updated Nov 19, 2024 3:26pm UTC)
DaveDeCaprio commented 2 days ago

Here is a screenshot of logs showing the initial OpenAI response_format being converted to fireworks.ai and making a successful call.


DaveDeCaprio commented 2 days ago
> - please add a unit test here: /test_fireworks_ai_translation.py
> - can you send a screenshot of a working request with your changes

Pushed a new commit that adds a test.
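A unit test along the requested lines might look like the sketch below. Everything here is hedged: `convert_openai_to_fireworks` is a hypothetical stand-in defined inline so the example is self-contained, not the actual litellm transformation, and the test name and schema are illustrative.

```python
def convert_openai_to_fireworks(response_format: dict) -> dict:
    # Minimal stand-in for the transformation under test.
    if response_format.get("type") == "json_schema":
        return {
            "type": "json_object",
            "schema": response_format["json_schema"]["schema"],
        }
    return response_format


def test_fireworks_ai_structured_output_translation():
    # An OpenAI-style response_format with a nested JSON schema.
    openai_format = {
        "type": "json_schema",
        "json_schema": {
            "name": "person",
            "schema": {
                "type": "object",
                "properties": {"name": {"type": "string"}},
            },
        },
    }
    result = convert_openai_to_fireworks(openai_format)
    # Fireworks expects the type "json_object" with the schema at top level.
    assert result["type"] == "json_object"
    assert result["schema"]["properties"]["name"]["type"] == "string"
```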