Generative AI suite powered by state-of-the-art models and providing advanced AI/AGI functions. It features AI personas, AGI functions, multi-model chats, text-to-image, voice, response streaming, code highlighting and execution, PDF import, presets for developers, much more. Deploy on-prem or in the cloud.
Why
Users will be able to supply multiple Gemini API keys, spreading requests across them to stay under per-key rate limits and improve throughput, especially when using Beam with a single AI provider.
Description
Add support for using multiple Gemini API keys in the application.
Requirements
[ ] Environment Variable:
Add a new environment variable GEMINI_API_KEYS in src/server/env.mjs to store a comma-separated list of API keys (a declaration sketch follows this list).
[ ] Key Selection Logic:
Modify the line const geminiKey = access.geminiKey || env.GEMINI_API_KEY || ''; in src/modules/llms/server/gemini/gemini.router.ts to:
let geminiKey: string;
// Keep the fallback chain: client-provided key, then the new GEMINI_API_KEYS list, then the single GEMINI_API_KEY
const rawGeminiKey = access.geminiKey || env.GEMINI_API_KEYS || env.GEMINI_API_KEY || '';
if (rawGeminiKey.includes(',')) {
  // A comma-separated value is treated as a list of keys; choose one at random per request
  const keys = rawGeminiKey.split(',').map(key => key.trim()).filter(key => key.length > 0);
  geminiKey = keys.length > 0 ? keys[Math.floor(Math.random() * keys.length)] : '';
} else {
  // Otherwise treat the value as a single key
  geminiKey = rawGeminiKey;
}
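For the Environment Variable requirement, a minimal declaration sketch is below. It assumes src/server/env.mjs validates server-side variables with a zod schema; the schema name serverEnvSchema is illustrative, and the actual file may be organized differently, in which case GEMINI_API_KEYS is simply added as another optional string next to the existing GEMINI_API_KEY.

// Sketch only, not the actual contents of src/server/env.mjs
import { z } from 'zod';

const serverEnvSchema = z.object({
  GEMINI_API_KEY: z.string().optional(),   // existing single-key variable
  GEMINI_API_KEYS: z.string().optional(),  // new: comma-separated list, e.g. "key-1,key-2,key-3"
});

// Parse once at startup; unknown variables in process.env are ignored by zod
export const env = serverEnvSchema.parse(process.env);

A deployment would then set GEMINI_API_KEYS=key-1,key-2,key-3, or paste the same comma-separated value where a single key is entered today, since the selection logic above splits on commas. Random selection keeps the key picking stateless; a round-robin counter would spread load more evenly but would need shared state across requests.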