BerriAI / litellm

Call all LLM APIs using the OpenAI format. Use Bedrock, Azure, OpenAI, Cohere, Anthropic, Ollama, Sagemaker, HuggingFace, Replicate (100+ LLMs)
https://docs.litellm.ai/docs/

[Feature]: Enable passing provider-specific parameters #526

Closed — krrishdholakia closed this issue 9 months ago

krrishdholakia commented 9 months ago

The Feature

Allow me to pass parameters supported by the specific provider I'm calling.
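
A minimal sketch of the requested usage, assuming provider-specific parameters can simply be passed alongside the shared OpenAI-style arguments and forwarded to the provider (the pass-through behavior, the `claude-2` model name, and the `top_k` value here are illustrative, not a confirmed API):

```python
import litellm

# Desired: pass an Anthropic-only parameter (top_k) next to the usual
# OpenAI-style arguments and have litellm forward it to the provider.
response = litellm.completion(
    model="claude-2",
    messages=[{"role": "user", "content": "Hey, how's it going?"}],
    top_k=40,  # not part of the OpenAI spec; Anthropic-specific
)
print(response)
```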

Motivation, pitch

Not all providers support the same parameters. The current **kwargs implementation is a hack, and it still pushes conditional logic onto the caller. Enable users to pass provider-specific parameters (e.g. top_k for Anthropic).
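
For illustration, a rough sketch of the kind of caller-side conditional logic this feature is meant to remove. The per-provider parameter names (`top_k` for Anthropic, `k` for Cohere) and the branching are assumptions for the example, not litellm's actual mapping:

```python
import litellm

def call_model(model: str, messages: list):
    # Today the caller has to know which provider accepts which kwargs
    # and branch accordingly before calling litellm.
    extra = {}
    if model.startswith("claude"):      # Anthropic
        extra["top_k"] = 40
    elif model.startswith("command"):   # Cohere
        extra["k"] = 40
    return litellm.completion(model=model, messages=messages, **extra)
```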

Twitter / LinkedIn details

No response

krrishdholakia commented 9 months ago

Picking this up now.

List of providers we need to add support for:

krrishdholakia commented 9 months ago

[Screenshot attached: "Screenshot 2023-10-05 at 2 08 09 PM"]

krrishdholakia commented 9 months ago

Got the integrations written. Now writing unit tests.
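
For context, a hypothetical unit-test sketch in the pytest style (not the actual test added in the repo), assuming the pass-through behavior described above and a valid Anthropic API key in the environment:

```python
import pytest
import litellm

def test_completion_claude_top_k():
    # Checks that an Anthropic-specific parameter is accepted
    # and the call completes without raising.
    try:
        response = litellm.completion(
            model="claude-2",
            messages=[{"role": "user", "content": "Hey, how's it going?"}],
            top_k=3,
        )
        print(response)
    except Exception as e:
        pytest.fail(f"Error occurred: {e}")
```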

krrishdholakia commented 9 months ago

Pushed dd7e397650acbe18ec8e5610ab05fc584a3c73c9

Will update the ticket once this is in prod.

krrishdholakia commented 9 months ago

in prod.