Closed: RaoHai closed this pull request 1 day ago
This pull request introduces support for configuring the `temperature`, number of responses (`n`), and `top_p` parameters in the bot's language model (LLM) configuration. These changes allow interactions with the LLM to be tuned per bot by adjusting these sampling parameters.
File | Summary
---|---
`server/agent/bot/__init__.py` | Updated the LLM initialization to include `temperature`, `n`, and `top_p` from the bot configuration.
`server/agent/llm/__init__.py` | Refactored the LLM class to support the new parameters and adjusted the client registry.
`server/agent/llm/base.py` | Modified the `BaseLLMClient` class to include the new parameters in its constructor.
`server/agent/llm/clients/gemini.py` | Updated `GeminiClient` to handle the new parameters in its initialization.
`server/agent/llm/clients/openai.py` | Enhanced the OpenAI client to support the new parameters.
`server/core/models/bot.py` | Added `temperature`, `n`, and `top_p` attributes to the `BotModel` class.
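For reference, `temperature`, `n`, and `top_p` are standard sampling controls in OpenAI-style chat-completions requests. A hedged sketch of the payload an OpenAI client might build from the new bot attributes; the `build_request` helper is hypothetical, though the three payload keys are real OpenAI API parameters:

```python
def build_request(
    model: str,
    messages: list,
    temperature: float,
    n: int,
    top_p: float,
) -> dict:
    # Map the bot-level attributes onto an OpenAI-style request body
    return {
        "model": model,
        "messages": messages,
        "temperature": temperature,  # sampling randomness (0.0 to 2.0)
        "n": n,                      # number of completions to return
        "top_p": top_p,              # nucleus-sampling probability mass
    }


payload = build_request(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Hello"}],
    temperature=0.7,
    n=2,
    top_p=0.95,
)
```

The OpenAI API recommends adjusting either `temperature` or `top_p`, not both; exposing both on `BotModel` leaves that choice to each bot's configuration.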