Closed alexandreteles closed 5 months ago
Should be fairly easy to add as a new engine.
I don't know if I can test this; I live in Canada, so API access to the latest models is questionable at best. Would appreciate it if someone living in a country that's not in the stone age of AI could test adding Gemini support. I think it'd be good to have.
I can share a key for testing if that would help, but sadly I don't have the time to implement the feature myself.
With the release of the new Gemini 1.5 Flash model, Google's LLMs have become more attractive and are competitive with alternatives like Mistral-Large. Although we can route requests through alternatives like OpenRouter, direct support would allow for cheaper and faster inference.
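For anyone picking this up, here's a rough sketch of what a direct Gemini engine might look like. It only builds the URL and JSON body for Google's `generativelanguage` REST endpoint (the `v1beta` `generateContent` call) without sending anything, so it can be checked offline; the endpoint shape, model name, and function names here are my assumptions, not anything from this repo:

```python
import json
from urllib.parse import urlencode

# Assumed public REST base for the Gemini API (v1beta generateContent).
GEMINI_BASE = "https://generativelanguage.googleapis.com/v1beta"


def build_gemini_request(prompt: str, api_key: str,
                         model: str = "gemini-1.5-flash"):
    """Return (url, body) for a generateContent call; no network I/O here.

    `build_gemini_request` is a hypothetical helper name, not an existing
    function in this codebase.
    """
    url = (f"{GEMINI_BASE}/models/{model}:generateContent"
           f"?{urlencode({'key': api_key})}")
    # Minimal request shape: a single user turn with one text part.
    body = json.dumps({"contents": [{"parts": [{"text": prompt}]}]})
    return url, body


url, body = build_gemini_request("Hello, Gemini!", api_key="YOUR_KEY")
```

An actual engine would POST `body` to `url` with `Content-Type: application/json` and read the reply text out of `candidates[0].content.parts`, but that part needs a working key to verify, which is exactly the testing problem above.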