Closed krrishdholakia closed 5 months ago
Didn't gemini-pro already get support in LiteLLM?
@Manouchehri we support gemini-pro on Vertex AI. I believe there's a separate gemini-pro on the PaLM API (you can access that one via API keys).
Async Ollama embeddings support is now added - should work for the proxy as well - https://github.com/BerriAI/litellm/commit/eaaad7982343e8d6dbc547cedc9dae88999d8b86
Initial commit adding Google Gemini support for completion calls - https://github.com/BerriAI/litellm/commit/1262d89ab385d16220d1578a4908f53b9bc5a075
Need to add:
cc: @toniengelhardt
Tracking list of new models / endpoints / providers we plan on adding this week.
Comment any new models/providers/endpoints you want us to add below 👇