IRedDragonICY opened this issue 2 months ago
Thanks😁👍🏻
@IRedDragonICY please report back any problems you might have, since I couldn't really test it, because I don't have a subscription. Thanks :pray: .
Hello @Blarc,
I have reviewed the new update and noticed that it currently supports only the Vertex format and not the Google AI Studio (Makersuite) format.
There are currently three models available there.
Here is an example from Google AI Studio (cURL):

```shell
API_KEY="YOUR_API_KEY"

curl \
  -X POST "https://generativelanguage.googleapis.com/v1beta/models/gemini-1.5-pro:generateContent?key=${API_KEY}" \
  -H 'Content-Type: application/json' \
  -d @<(echo '{
    "contents": [
      {
        "role": "user",
        "parts": [
          { "text": "INSERT_INPUT_HERE" }
        ]
      }
    ],
    "generationConfig": {
      "temperature": 1,
      "topK": 64,
      "topP": 0.95,
      "maxOutputTokens": 8192,
      "responseMimeType": "text/plain"
    },
    "safetySettings": [
      { "category": "HARM_CATEGORY_HARASSMENT",        "threshold": "BLOCK_MEDIUM_AND_ABOVE" },
      { "category": "HARM_CATEGORY_HATE_SPEECH",       "threshold": "BLOCK_MEDIUM_AND_ABOVE" },
      { "category": "HARM_CATEGORY_SEXUALLY_EXPLICIT", "threshold": "BLOCK_MEDIUM_AND_ABOVE" },
      { "category": "HARM_CATEGORY_DANGEROUS_CONTENT", "threshold": "BLOCK_MEDIUM_AND_ABOVE" }
    ]
  }')
```
With Google AI Studio, a Project ID and Location are not required; only the model name and the API key are needed.
Thank you for your attention to this detail.
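To make the difference concrete, here is a minimal Python sketch of an AI Studio `generateContent` request, built with only a model name and an API key. `build_request` is a hypothetical helper name; the endpoint and body shape follow the cURL example above.

```python
# Minimal sketch of a Google AI Studio (Makersuite) generateContent request.
# Unlike Vertex AI, only a model name and an API key are needed;
# no Project ID or Location appears anywhere in the URL.
import json
import urllib.request

BASE_URL = "https://generativelanguage.googleapis.com/v1beta"

def build_request(model: str, api_key: str, prompt: str) -> urllib.request.Request:
    """Build (but do not send) a generateContent request."""
    url = f"{BASE_URL}/models/{model}:generateContent?key={api_key}"
    body = {
        "contents": [
            {"role": "user", "parts": [{"text": prompt}]}
        ],
        "generationConfig": {"temperature": 1, "maxOutputTokens": 8192},
    }
    return urllib.request.Request(
        url,
        data=json.dumps(body).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_request("gemini-1.5-pro", "YOUR_API_KEY", "Write a commit message")
# urllib.request.urlopen(req) would send the call; skipped here to avoid a live request.
```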
Hey @IRedDragonICY, it seems langchain4j does not support this format?
Unfortunately, it seems that Google AI Studio is not supported by langchain4j at this time, with only the Google Vertex format being compatible. As a temporary workaround, you might consider making HTTP requests manually for integration with Google AI Studio.
> Unfortunately, it seems that Google AI Studio is not supported by langchain4j at this time, with only the Google Vertex format being compatible. As a temporary workaround, you might consider making HTTP requests manually for integration with Google AI Studio.
I would rather try to open an issue or PR to langchain4j and first add the integration there.
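For reference, the "manual HTTP requests" workaround mostly comes down to sending the request shown earlier and extracting the generated text from the response. A sketch of the parsing half, assuming the documented `candidates[0].content.parts[*].text` response shape of the v1beta API:

```python
# Sketch of the "manual HTTP" workaround: parse a generateContent response
# without langchain4j. The response shape follows the public
# generativelanguage.googleapis.com/v1beta API.
import json

def extract_text(response_json: str) -> str:
    """Pull the generated text out of a generateContent response."""
    data = json.loads(response_json)
    parts = data["candidates"][0]["content"]["parts"]
    return "".join(part.get("text", "") for part in parts)

# Example response, trimmed to the fields the helper reads:
sample = json.dumps({
    "candidates": [
        {"content": {"role": "model", "parts": [{"text": "fix: handle null API key"}]}}
    ]
})
print(extract_text(sample))  # fix: handle null API key
```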
@IRedDragonICY you can use this and stage it on Vercel on your own; this is what I'm using, since the plugin doesn't support Gemini yet.
> @IRedDragonICY you can use this and stage it on Vercel on your own; this is what I'm using, since the plugin doesn't support Gemini yet.
@MarJose123, Thanks for the suggestion! I’ve been using a similar solution for now, which is this repository: zhu327/gemini-openai-proxy. It's been working quite well as an interim measure until there's official support in the plugin.
> @IRedDragonICY you can use this and stage it on Vercel on your own; this is what I'm using, since the plugin doesn't support Gemini yet.
Thanks a lot for the incredibly easy workaround!
Just wanna add a bit more info for anyone like me who doesn't have much experience with Vercel:
1. Click the `Deploy` button under *Deploy with Vercel* and follow all the instructions (log in with GitHub, clone the repo, deploy the project, etc.). You will then have a domain like `xxxxx.vercel.app`.
2. Append `v1` to the URL and put it in the `Host` field on the plugin's `Edit LLM Client` settings page, e.g. `Host` = `https://xxxxx.vercel.app/v1`. Paste your Gemini API key in the `Token` field, then click `Verify` and `Update`.
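If it helps, here is a sketch of what the plugin ends up sending through such a proxy. The proxy exposes an OpenAI-compatible `/v1/chat/completions` endpoint, so an OpenAI-style request with the Gemini API key as the bearer token is all that's needed; the host below is a placeholder, and the model-name mapping is the proxy's assumption, not the plugin's.

```python
# Sketch of an OpenAI-style request aimed at the Vercel-hosted
# gemini-openai-proxy. The proxy translates OpenAI chat-completion
# requests into Gemini generateContent calls behind the scenes.
import json
import urllib.request

def proxy_request(host: str, gemini_api_key: str, prompt: str) -> urllib.request.Request:
    """Build (but do not send) a chat-completions request to the proxy."""
    body = {
        # The proxy maps OpenAI model names onto Gemini models.
        "model": "gpt-3.5-turbo",
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{host}/chat/completions",
        data=json.dumps(body).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {gemini_api_key}",
        },
        method="POST",
    )

req = proxy_request("https://xxxxx.vercel.app/v1", "YOUR_GEMINI_API_KEY", "hi")
# Sending req through urllib.request.urlopen would hit the live proxy; omitted here.
```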
Support Gemini API (from Vertex or Makersuite)╰(°▽°)╯