Closed TheValkyrja closed 2 months ago
Thanks. When the params are "shared," carrying the previous values forward mid-conversation is intended, but in this case they are only shared in name, and some distinction needs to be made between endpoints for a more seamless experience.
This also appears to happen for conversations initiated with OpenAI models (only gpt-4o was tested) that switch to Anthropic models mid-conversation.
What happened?
When using the switch-endpoints-mid-conversation feature after generating a response with a model, the parameters do not dynamically change to the new endpoint's default values. For example, max output tokens remains 8192 when changing from gemini-1.5-pro to claude-3-opus (likewise for the other parameters: top P, top K, etc.), and this causes an error.
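A possible fix is to reset the per-conversation sampling parameters to the new endpoint's defaults whenever the endpoint changes, instead of carrying the old values over. The sketch below illustrates the idea; all names, structures, and default values are hypothetical and do not reflect the project's actual code or the providers' real defaults.

```python
# Illustrative defaults only; real defaults differ per provider.
ENDPOINT_DEFAULTS = {
    "google": {"temperature": 0.2, "top_p": 0.8, "top_k": 40, "max_output_tokens": 8192},
    "anthropic": {"temperature": 1.0, "top_p": 0.7, "top_k": 5, "max_output_tokens": 4096},
}

def switch_endpoint(conversation: dict, new_endpoint: str) -> dict:
    """Return a copy of the conversation with parameters reset to the
    new endpoint's defaults, dropping any stale values from the old one."""
    updated = dict(conversation)
    updated["endpoint"] = new_endpoint
    updated["params"] = dict(ENDPOINT_DEFAULTS[new_endpoint])
    return updated

# Start a conversation on the Google endpoint, then switch to Anthropic.
convo = {"endpoint": "google", "params": dict(ENDPOINT_DEFAULTS["google"])}
convo = switch_endpoint(convo, "anthropic")
```

With this approach, the stale `max_output_tokens: 8192` from Gemini would be replaced rather than sent to a model that rejects it.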
Steps to Reproduce
1. Use gemini-1.5-flash-latest with its default parameter values (temperature 0.2, top P 0.8, top K 40, max output tokens 8192) to generate a response.
2. Change the conversation model to Anthropic claude-3-opus without setting any parameter values.
3. The model returns an error message saying the value is unsupported.
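The failure in step 3 is consistent with the stale max-output value exceeding the new model's ceiling (claude-3-opus caps `max_tokens` at 4096, while Gemini's default here is 8192). A minimal sketch of a guard that would avoid the rejection by clamping to the target model's limit; the limit table and function names are assumptions for illustration:

```python
# Hypothetical per-model output-token ceilings.
MODEL_MAX_OUTPUT = {
    "gemini-1.5-flash-latest": 8192,
    "claude-3-opus": 4096,
}

def clamp_max_tokens(model: str, requested: int) -> int:
    """Clamp a requested max-output value to the target model's limit,
    so a value inherited from another endpoint cannot cause a rejection."""
    return min(requested, MODEL_MAX_OUTPUT[model])
```

Clamping is a fallback; resetting to the new endpoint's defaults on switch would be the cleaner behavior.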
What browsers are you seeing the problem on?
Chrome
Relevant log output
No response
Screenshots
As the screenshot shows, these are the default values for Gemini, but they are wrongly applied to Claude.
Code of Conduct