Open traderpedroso opened 2 days ago
Could you add an embeddings endpoint compatible with OpenAI? There is a Go proxy that does this: https://github.com/cheahjs/gemini-to-openai-proxy. Also, could you add an option to send chat history?
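For reference, a minimal sketch of what an OpenAI-compatible embeddings call could look like against such a proxy. The base URL, port, and model name here are assumptions, not the proxy's actual defaults; the request/response shapes follow OpenAI's `/v1/embeddings` format (`{ model, input }` in, `{ data: [{ embedding: [...] }] }` out):

```javascript
// Build a request body in the OpenAI embeddings format.
function buildEmbeddingRequest(model, input) {
  return { model, input };
}

// Pull the embedding vectors out of an OpenAI-style response.
function parseEmbeddingResponse(response) {
  return response.data.map((item) => item.embedding);
}

// Hypothetical client call against a local proxy (URL/model are assumptions).
async function embed(texts) {
  const res = await fetch("http://localhost:8080/v1/embeddings", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.GEMINI_API_KEY}`,
    },
    body: JSON.stringify(buildEmbeddingRequest("text-embedding-004", texts)),
  });
  return parseEmbeddingResponse(await res.json());
}
```

With that shape, existing OpenAI clients (LangChain's `OpenAIEmbeddings`, etc.) could point their base URL at the proxy without code changes.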
Example in JS, as used with LangChain or Redis chat memory:
```javascript
const {
  GoogleGenerativeAI,
  HarmCategory,
  HarmBlockThreshold,
} = require("@google/generative-ai");

const apiKey = process.env.GEMINI_API_KEY;
const genAI = new GoogleGenerativeAI(apiKey);

const model = genAI.getGenerativeModel({
  model: "gemini-1.5-flash",
  systemInstruction: "",
});

const generationConfig = {
  temperature: 1,
  topP: 0.95,
  topK: 64,
  maxOutputTokens: 8192,
  responseMimeType: "text/plain",
};

async function run() {
  const chatSession = model.startChat({
    generationConfig,
    // safetySettings: Adjust safety settings
    // See https://ai.google.dev/gemini-api/docs/safety-settings
    history: [
      {
        role: "user",
        parts: [{ text: "user history" }],
      },
      {
        role: "model",
        parts: [{ text: "model history" }],
      },
    ],
  });

  const result = await chatSession.sendMessage("INSERT_INPUT_HERE");
  console.log(result.response.text());
}

run();
```