petercat-ai / petercat

A conversational Q&A agent configuration system, self-hosted deployment solutions, and a convenient all-in-one application SDK that lets you create intelligent Q&A bots for your GitHub repositories.
https://petercat.ai
MIT License
594 stars · 16 forks

feat: support config temperature in bot #496

Closed. RaoHai closed this pull request 1 day ago.

petercat-assistant[bot] commented 1 day ago

Walkthrough

This pull request adds support for configuring the temperature, n (number of generated responses), and top_p parameters in the bot's language model (LLM) configuration. Exposing these sampling parameters makes each bot's interaction with the LLM tunable rather than fixed.
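As a rough illustration of what such a per-bot configuration might look like, here is a minimal sketch (the field names come from the PR; the class name, defaults, and validation ranges are assumptions, not the actual `BotModel` code):

```python
from dataclasses import dataclass


@dataclass
class LLMSamplingConfig:
    """Hypothetical mirror of the new sampling fields added to the bot model.

    Defaults and bounds below are illustrative assumptions, not values
    taken from the petercat codebase.
    """
    temperature: float = 0.2  # higher values -> more random output
    n: int = 1                # number of completions to generate
    top_p: float = 1.0        # nucleus-sampling probability mass

    def __post_init__(self) -> None:
        # Typical ranges accepted by OpenAI-style chat APIs.
        if not 0.0 <= self.temperature <= 2.0:
            raise ValueError("temperature must be in [0, 2]")
        if self.n < 1:
            raise ValueError("n must be >= 1")
        if not 0.0 < self.top_p <= 1.0:
            raise ValueError("top_p must be in (0, 1]")


cfg = LLMSamplingConfig(temperature=0.7)
```

A config object like this would then be read when the bot initializes its LLM client, so each bot can carry its own sampling behavior.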

Changes

| File | Summary |
| --- | --- |
| `server/agent/bot/__init__.py` | Updated the LLM initialization to include temperature, n, and top_p from the bot configuration. |
| `server/agent/llm/__init__.py` | Refactored the LLM class to support the new parameters and adjusted the client registry. |
| `server/agent/llm/base.py` | Modified the `BaseLLMClient` class to include the new parameters in its constructor. |
| `server/agent/llm/clients/gemini.py` | Updated `GeminiClient` to handle the new parameters in its initialization. |
| `server/agent/llm/clients/openai.py` | Enhanced the OpenAI client to support the new parameters. |
| `server/core/models/bot.py` | Added temperature, n, and top_p attributes to the `BotModel` class. |
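The common thread across these files is a base client whose constructor accepts the sampling parameters and concrete clients that forward them to their provider's API. A minimal sketch of that shape (class and method names here are illustrative; the real `BaseLLMClient` and client registry in petercat may differ):

```python
class BaseLLMClient:
    """Hypothetical base class: the constructor now accepts sampling
    parameters and stores them for concrete provider clients to use."""

    def __init__(self, temperature: float = 0.2, n: int = 1,
                 top_p: float = 1.0) -> None:
        self.temperature = temperature
        self.n = n
        self.top_p = top_p


class OpenAIClient(BaseLLMClient):
    """Sketch of a concrete client that forwards the stored parameters."""

    def completion_kwargs(self, model: str = "gpt-4o") -> dict:
        # Build the keyword arguments an OpenAI-style chat-completion
        # call would receive; the actual petercat client wires these
        # into its LLM library rather than returning a dict.
        return {
            "model": model,
            "temperature": self.temperature,
            "n": self.n,
            "top_p": self.top_p,
        }


client = OpenAIClient(temperature=0.9, top_p=0.95)
```

With this structure, `server/agent/bot/__init__.py` only needs to read the three values from the bot record and pass them to the client constructor; each provider client decides how to map them onto its own API.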