BerriAI / litellm

Call all LLM APIs using the OpenAI format. Use Bedrock, Azure, OpenAI, Cohere, Anthropic, Ollama, Sagemaker, HuggingFace, Replicate (100+ LLMs)
https://docs.litellm.ai/docs/

[18/12/2023 - 24/12/2023] New Models/Endpoints/Providers #1193

Closed. krrishdholakia closed this issue 5 months ago.

krrishdholakia commented 7 months ago

Tracking list of new models / endpoints / providers we plan on adding this week.

Comment any new models/providers/endpoints you want us to add below 👇

Manouchehri commented 7 months ago

Didn't gemini-pro already get support in LiteLLM?

krrishdholakia commented 7 months ago

@Manouchehri we support gemini-pro on Vertex AI; I believe there's a separate gemini-pro on the PaLM API (you can access that one via API keys).
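For reference, a rough sketch of the Vertex AI route that LiteLLM already supports. The project and location values are placeholders, and passing them as keyword arguments is one of a couple of ways LiteLLM accepts Vertex config:

```python
import litellm

# gemini-pro through Vertex AI (the route already supported in LiteLLM).
# "my-gcp-project" / "us-central1" are placeholders for your own GCP settings.
response = litellm.completion(
    model="vertex_ai/gemini-pro",
    messages=[{"role": "user", "content": "Hello from LiteLLM"}],
    vertex_project="my-gcp-project",
    vertex_location="us-central1",
)
print(response.choices[0].message.content)
```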

krrishdholakia commented 7 months ago

Async Ollama embeddings support is now added - it should work for the proxy as well - https://github.com/BerriAI/litellm/commit/eaaad7982343e8d6dbc547cedc9dae88999d8b86
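A minimal sketch of what the async call looks like, assuming a local Ollama server on the default port; the embedding model name here is just an example of an embedding-capable Ollama model:

```python
import asyncio
import litellm

async def main():
    # Async embeddings against a locally running Ollama server.
    # "ollama/nomic-embed-text" is an example model; swap in whatever you have pulled.
    response = await litellm.aembedding(
        model="ollama/nomic-embed-text",
        input=["LiteLLM now supports async Ollama embeddings"],
        api_base="http://localhost:11434",
    )
    # Print the dimensionality of the first returned embedding.
    print(len(response.data[0]["embedding"]))

asyncio.run(main())
```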

krrishdholakia commented 6 months ago

Initial commit adding Google Gemini support for completion calls - https://github.com/BerriAI/litellm/commit/1262d89ab385d16220d1578a4908f53b9bc5a075
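A sketch of the API-key route this commit enables, assuming the key is read from a `GEMINI_API_KEY` environment variable (exact env var and model string may differ from the final implementation):

```python
import os
import litellm

# gemini-pro via the Google AI Studio API key route (the new "gemini/" prefix),
# as opposed to the Vertex AI route shown earlier in the thread.
os.environ["GEMINI_API_KEY"] = "your-api-key"  # placeholder

response = litellm.completion(
    model="gemini/gemini-pro",
    messages=[{"role": "user", "content": "Summarize this week's new providers."}],
)
print(response.choices[0].message.content)
```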

Need to add:

cc: @toniengelhardt