BerriAI / litellm

Call all LLM APIs using the OpenAI format. Use Bedrock, Azure, OpenAI, Cohere, Anthropic, Ollama, Sagemaker, HuggingFace, Replicate (100+ LLMs)
https://docs.litellm.ai/docs/

fix(utils.py): allow dropping specific openai params #4313

Closed · krrishdholakia closed this 1 week ago

krrishdholakia commented 1 week ago

Title

allow dropping specific openai params

```yaml
- litellm_params:
    api_base: my-base
    model: openai/my-model
    additional_drop_params: ["logit_bias"] # 👈 KEY CHANGE
  model_name: my-model
```

Relevant issues

Fixes an issue where vllm would fail calls that included specific `logit_bias` params

Type

🆕 New Feature

Changes

[REQUIRED] Testing - Attach a screenshot of any new tests passing locally

If UI changes, send a screenshot/GIF of working UI fixes

4 new unit tests added

[Screenshot: new unit tests passing locally (2024-06-20, 11:56 AM)]
vercel[bot] commented 1 week ago

The latest updates on your projects. Learn more about Vercel for Git ↗︎

| Name | Status | Preview | Comments | Updated (UTC) |
| --- | --- | --- | --- | --- |
| litellm | ✅ Ready (Inspect) | Visit Preview | 💬 Add feedback | Jun 20, 2024 9:15pm |
Manouchehri commented 1 week ago

+1 to this, I want to use `additional_drop_params: ["user"]` for some embedding models.
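For that embedding use case, the same proxy config pattern from the PR description would presumably look like the following (the model name and api_base are placeholders, not from this thread):

```yaml
- litellm_params:
    api_base: my-base
    model: openai/my-embedding-model
    additional_drop_params: ["user"] # strip "user" before the provider call
  model_name: my-embedding-model
```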

krrishdholakia commented 1 week ago

oh - which ones? @Manouchehri