gleachkr opened this issue 5 days ago:

Temperatures are currently represented by a u32, but OpenAI expects a decimal between 0 and 2, while Anthropic expects a decimal between 0 and 1. Maybe it would make sense to normalize the temperatures per-LLM before sending off the API requests?

Dario A Lencina-Talarico replied:

Hey @gleachkr, that is a great point!
How about using a temperature expressed as a float in [0, 1] and then scaling it to each LLM provider's expected range (Anthropic, OpenAI)?
Would that work?

gleachkr replied:

Makes sense to me!
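
To make the agreed approach concrete, here is a minimal sketch in Rust of what a normalized temperature could look like. The `Temperature` type and the `for_openai` / `for_anthropic` names are hypothetical, not existing code in this repo; the only details taken from the thread are the provider ranges (0 to 2 for OpenAI, 0 to 1 for Anthropic) and storing the user-facing value as a float in [0, 1].

```rust
/// Provider-agnostic sampling temperature, normalized to [0.0, 1.0].
/// Hypothetical sketch: names and API are not from the existing codebase.
#[derive(Debug, Clone, Copy, PartialEq)]
pub struct Temperature(f32);

impl Temperature {
    /// Clamp into [0.0, 1.0] so out-of-range input never reaches a provider API.
    pub fn new(value: f32) -> Self {
        Temperature(value.clamp(0.0, 1.0))
    }

    /// OpenAI accepts temperatures in [0.0, 2.0], so scale up by 2.
    pub fn for_openai(self) -> f32 {
        self.0 * 2.0
    }

    /// Anthropic accepts temperatures in [0.0, 1.0], so pass through unchanged.
    pub fn for_anthropic(self) -> f32 {
        self.0
    }
}

fn main() {
    let temp = Temperature::new(0.7);
    // 0.7 scales to 1.4 for OpenAI and stays 0.7 for Anthropic.
    println!("openai={}, anthropic={}", temp.for_openai(), temp.for_anthropic());
}
```

Clamping in the constructor is just one option; validating the input and returning a `Result` for out-of-range values would work equally well.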