zbjlala closed this issue 4 weeks ago
@zbjlala Hello there! I'm here to assist you with any bugs, questions, and contributions while we wait for a human maintainer. Let's tackle this issue together!
I found a similar issue that was closed by a pull request addressing the Azure `o1-mini` and `o1-preview` models [1]. The error you're encountering with unsupported parameters like `max_tokens` can be resolved by using `max_completion_tokens` instead. Ensure your requests adhere to the parameter rules defined for these models.
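As a hedged sketch of the parameter rule described above (the helper name and the model-name prefixes are assumptions based on the error message, not Dify's actual implementation), remapping the request parameters before calling the API might look like:

```python
# Sketch: remap `max_tokens` to `max_completion_tokens` for o1-family
# models, and drop `temperature`, which the issue reports is also
# rejected. The prefix list is an assumption, not Dify's code.

O1_PREFIXES = ("o1-mini", "o1-preview")

def adapt_params(model: str, params: dict) -> dict:
    """Return a copy of `params` adjusted for o1-family models."""
    adapted = dict(params)
    if model.startswith(O1_PREFIXES):
        if "max_tokens" in adapted:
            adapted["max_completion_tokens"] = adapted.pop("max_tokens")
        adapted.pop("temperature", None)  # o1 models only accept the default
    return adapted

# The failing parameters from this issue (temperature=0, max_tokens=20):
print(adapt_params("o1-mini", {"temperature": 0, "max_tokens": 20}))
# -> {'max_completion_tokens': 20}
```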
For the `tiktoken` version issue, make sure to update `tiktoken` to version 0.8.0, which supports the `o1` model [2].
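A minimal sketch of guarding on the installed `tiktoken` version before counting tokens for o1 models (the helper names are illustrative, and a real implementation might use `packaging.version` instead of the hand-rolled parser):

```python
from importlib.metadata import PackageNotFoundError, version

# Sketch: verify that the installed tiktoken is new enough for the o1
# models. The 0.8.0 minimum comes from this thread; the helpers are
# illustrative, not Dify's actual code.

MIN_TIKTOKEN = (0, 8, 0)

def parse_version(v: str) -> tuple:
    """Parse 'X.Y.Z' into a comparable tuple of ints."""
    return tuple(int(part) for part in v.split(".")[:3])

def tiktoken_supports_o1() -> bool:
    """True if an o1-capable tiktoken (>= 0.8.0) is installed."""
    try:
        return parse_version(version("tiktoken")) >= MIN_TIKTOKEN
    except PackageNotFoundError:
        return False

print(parse_version("0.7.0") >= MIN_TIKTOKEN)  # False: 0.7.0 predates o1 support
print(parse_version("0.8.0") >= MIN_TIKTOKEN)  # True
```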
Check this PR: https://github.com/langgenius/dify/pull/9326
Self Checks
Dify version
0.9.1
Cloud or Self Hosted
Self Hosted (Docker), Self Hosted (Source)
Steps to reproduce
1. Add the o1-mini / o1-preview models in Azure. Saving fails with: Error code: 400 - {'error': {'message': "Unsupported parameter: 'max_tokens' is not supported with this model. Use 'max_completion_tokens' instead.", 'type': 'invalid_request_error', 'param': 'max_tokens', 'code': 'unsupported_parameter'}}. The parameters temperature=0 and max_tokens=20 are not supported.
2. Using o1-mini / o1-preview fails in _num_tokens_from_messages: tiktoken 0.7.0 does not support the o1 models, while 0.8.0 does.
✔️ Expected Behavior
The Azure o1 models can be added and used.
❌ Actual Behavior
Saving the model configuration fails with the 400 error above, and the models cannot be used.