WIZARDELF opened 2 weeks ago
You can use any OpenAI-compatible service by customizing the ancilla-adaptor-chat-api-endpoint variable, which defaults to "https://api.openai.com/v1/chat/completions".
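For example, pointing it at a different OpenAI-compatible gateway is a one-liner (the URL below is only a placeholder, not a real service):

(setq ancilla-adaptor-chat-api-endpoint
      "https://my-gateway.example.com/v1/chat/completions")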
We need to change the code: it seems Gemini does not use the OAuth 2.0-style "Authorization: Bearer" header. Instead, the Gemini API key is passed as a URL query parameter, "?key=$GEMINI_API_KEY".
FYI:
curl "https://generativelanguage.googleapis.com/v1beta/models/gemini-1.5-pro:generateContent?key=$GEMINI_API_KEY" \
-H "Content-Type: application/json" \
-d '{
"contents": [{
"parts": [
{"text": "Why do we not abstract light from mango?"}
]
}],
"generationConfig": {
"response_mime_type": "application/json"
}
}'
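In Elisp, the same request shape would look roughly like this (an untested sketch using the built-in url.el; my/gemini-api-key and my/gemini-generate-content are placeholder names, not existing ancilla.el code):

(require 'url)
(require 'json)

(defvar my/gemini-api-key nil
  "Gemini API key; passed as the ?key= query parameter, not as a header.")

(defun my/gemini-generate-content (prompt)
  "POST PROMPT to the Gemini generateContent endpoint; return parsed JSON."
  (let* ((endpoint (concat "https://generativelanguage.googleapis.com/v1beta/"
                           "models/gemini-1.5-pro:generateContent"
                           "?key=" (url-hexify-string my/gemini-api-key)))
         (url-request-method "POST")
         (url-request-extra-headers '(("Content-Type" . "application/json")))
         (url-request-data
          (encode-coding-string
           (json-encode
            `((contents . [((parts . [((text . ,prompt))]))])))
           'utf-8)))
    (with-current-buffer (url-retrieve-synchronously endpoint)
      (re-search-forward "\n\n")          ; skip HTTP response headers
      (json-parse-buffer :object-type 'alist))))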
Unfortunately, I don't have a key for Gemini and am not interested in getting one at the moment.
In https://github.com/shouya/ancilla.el/commit/a22bdbdab1c4ed4f791acaa378667f8ecac65ada I made the necessary change to decouple the OpenAI requesting logic into a standalone function, so Gemini support should be easy: add 'gemini to the list of options for the backend (https://github.com/shouya/ancilla.el/commit/a22bdbdab1c4ed4f791acaa378667f8ecac65ada#diff-9b4db2d57695a6633d5a249a88415c37a4dc8324e87d7952c86e750be19a5415R44-R49) and add a request function for it, like the one at https://github.com/shouya/ancilla.el/commit/a22bdbdab1c4ed4f791acaa378667f8ecac65ada#diff-9b4db2d57695a6633d5a249a88415c37a4dc8324e87d7952c86e750be19a5415R716.
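Roughly, the two touch points would look like this (an untested sketch; the option and function names below are placeholders and would need to match the actual code in that commit):

(defcustom ancilla-adaptor-chat-backend 'openai
  "Which chat backend the adaptor talks to."
  :type '(choice (const :tag "OpenAI-compatible" openai)
                 (const :tag "Google Gemini" gemini))
  :group 'ancilla)

(defun ancilla--adaptor-chat-request (messages callback)
  "Send MESSAGES to the configured backend; call CALLBACK with the reply."
  (pcase ancilla-adaptor-chat-backend
    ('openai (ancilla--request-openai messages callback))   ; the decoupled function
    ('gemini (ancilla--request-gemini messages callback)))) ; still to be written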
It would be nice to have support for non-OpenAI models.