BerriAI / litellm

Python SDK, Proxy Server (LLM Gateway) to call 100+ LLM APIs in OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
https://docs.litellm.ai/docs/

[Feature]: JSON can be passed to the Gemini API #5766

Closed: Gekko0114 closed this issue 1 month ago

Gekko0114 commented 2 months ago

The Feature

When calling the Gemini API using Litellm, it is necessary to pass an API key. However, the raw Gemini API can also be called using JSON. Therefore, I would like to request that Litellm allow calling the Gemini API using JSON as well.
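
For reference, a minimal sketch of the API-key path the request refers to (the model name, key value, and gemini/ prefix below are illustrative, not taken from this thread):

import os
import litellm

# Current Google AI Studio route: LiteLLM reads an API key from the environment
os.environ["GEMINI_API_KEY"] = "your-api-key"  # placeholder value

response = litellm.completion(
    model="gemini/gemini-1.5-pro",  # illustrative model name
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)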

Motivation, pitch

Described in the feature section above.

Twitter / LinkedIn details

No response

krrishdholakia commented 2 months ago

However, the raw Gemini API can also be called using JSON

What does this mean? @Gekko0114 can you point to relevant docs here?

Gekko0114 commented 2 months ago

I mean that Gemini can be called using GOOGLE_APPLICATION_CREDENTIALS.

import os
import google.generativeai as genai

# Raw Gemini SDK call authenticated via a service-account JSON file
# rather than an API key ("xxx"/"yyy"/"zzz" are placeholders)
os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "./google_credentials.json"

model = genai.GenerativeModel(
    model_name="xxx",
    generation_config="yyy",
    system_instruction="zzz",
)
response = model.generate_content(messages)

krrishdholakia commented 2 months ago

This is already supported, @Gekko0114.

If you're seeing an issue, can you please share it?
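
For context, a minimal sketch of what "already supported" likely refers to: routing Gemini through LiteLLM's vertex_ai provider, which can authenticate with a service-account JSON file via Application Default Credentials. The model name, project, and location below are assumptions for illustration:

import os
import litellm

# Hedged sketch: authenticate with a service-account JSON file instead of an API key;
# the vertex_ai provider is expected to pick this up as Application Default Credentials
os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "./google_credentials.json"

response = litellm.completion(
    model="vertex_ai/gemini-1.5-pro",  # illustrative model name
    messages=[{"role": "user", "content": "Hello"}],
    vertex_project="my-gcp-project",   # assumed project ID
    vertex_location="us-central1",     # assumed region
)
print(response.choices[0].message.content)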

Gekko0114 commented 1 month ago

Really? Then I will try it.