Codium-ai / pr-agent

🚀CodiumAI PR-Agent: An AI-Powered 🤖 Tool for Automated Pull Request Analysis, Feedback, Suggestions and More! 💻🔍
Apache License 2.0

Add Model Hyperparameter Configuration #1056

Closed h0rv closed 3 months ago

h0rv commented 3 months ago

Currently, the model temperature, for example, is hardcoded to 0.2 throughout the codebase.

That is a reasonable temperature for producing good responses with fewer hallucinations. However, now that open-source models are becoming more capable (e.g. Llama 3 405B), model hyperparameters, like prompts, do not translate 1:1 from model to model.

So I think it would be a good change to make the temperature, and potentially other parameters, configurable in configuration.toml.
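For illustration, such a setting might look like the following in configuration.toml (the section and key names here are a hypothetical sketch, not necessarily the actual pr-agent schema):

```toml
# Hypothetical configuration.toml fragment making the temperature configurable
[config]
temperature = 0.2  # default; raise or lower per model
```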

mrT23 commented 3 months ago

Low temperature doesn't reduce hallucinations. If t=0 resulted in zero hallucinations, people would use t=0 100% of the time.

In any case, for a different reason (trying to support seed), I have implemented temperature as a config parameter: https://github.com/Codium-ai/pr-agent/pull/1063

h0rv commented 3 months ago

Awesome, thank you!