langgenius / dify

Dify is an open-source LLM app development platform. Dify's intuitive interface combines AI workflow, RAG pipeline, agent capabilities, model management, observability features and more, letting you quickly go from prototype to production.
https://dify.ai

Azure o1-mini / o1-preview: model save error and chat error #9327

Closed zbjlala closed 4 weeks ago

zbjlala commented 1 month ago

Self Checks

Dify version

0.9.1

Cloud or Self Hosted

Self Hosted (Docker), Self Hosted (Source)

Steps to reproduce

1. Adding the o1-mini / o1-preview models in Azure fails on save with:
   `Error code: 400 - {'error': {'message': "Unsupported parameter: 'max_tokens' is not supported with this model. Use 'max_completion_tokens' instead.", 'type': 'invalid_request_error', 'param': 'max_tokens', 'code': 'unsupported_parameter'}}`
   The parameters `temperature=0` and `max_tokens=20` are not supported by these models.

2. Chatting with o1-mini / o1-preview fails: `_num_tokens_from_messages` relies on tiktoken 0.7.0, which does not recognize the o1 models; tiktoken 0.8.0 does.
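The save error in step 1 comes down to a parameter mapping: Azure's o1 models reject `max_tokens` in favor of `max_completion_tokens`, and they do not accept a custom `temperature`. A minimal sketch of the kind of request-parameter adaptation involved (the function name and exact rules here are illustrative, not Dify's actual implementation):

```python
def adapt_params_for_o1(params: dict) -> dict:
    """Rewrite chat-completion parameters for Azure o1-family models.

    - Renames 'max_tokens' to 'max_completion_tokens', per the 400 error
      returned by the API.
    - Drops 'temperature', since o1 models only accept the default value.
    """
    out = dict(params)
    if "max_tokens" in out:
        out["max_completion_tokens"] = out.pop("max_tokens")
    out.pop("temperature", None)
    return out


# The failing request from the issue (temperature=0, max_tokens=20)
# would be rewritten before being sent to the Azure endpoint:
adapted = adapt_params_for_o1({"temperature": 0, "max_tokens": 20})
```

With this mapping, the request above becomes `{"max_completion_tokens": 20}`, which matches what the API's error message asks for.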

✔️ Expected Behavior

The Azure o1 models can be saved and used in chat.

❌ Actual Behavior

Saving the model fails with the error above, and chat requests also error out, so the models are unusable.

dosubot[bot] commented 1 month ago

@zbjlala Hello there! I'm here to assist you with any bugs, questions, and contributions while we wait for a human maintainer. Let's tackle this issue together!

I found a similar issue that was closed by a pull request addressing the Azure o1-mini and o1-preview models [1]. The error you're encountering with unsupported parameters like max_tokens can be resolved by using max_completion_tokens instead. Ensure your requests adhere to the parameter rules defined for these models.

For the tiktoken version issue, make sure to update tiktoken to version 0.8.0, which supports the o1 model [2].
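The token-counting failure happens because tiktoken releases before 0.8.0 have no mapping for the `o1-*` model names, so looking up their encoding raises an error. The o1 family shares the `o200k_base` encoding with the gpt-4o models, so a sketch of a version-tolerant fallback might look like this (the function name is illustrative; Dify's actual code path is `_num_tokens_from_messages`):

```python
def encoding_name_for_model(model: str) -> str:
    """Pick a tiktoken encoding name for token counting.

    tiktoken < 0.8.0 does not know the 'o1-*' model names, so instead of
    calling tiktoken.encoding_for_model() and failing with a KeyError,
    map the model family to its encoding explicitly.
    """
    # o1-* and gpt-4o-* models use the "o200k_base" encoding.
    if model.startswith(("o1", "gpt-4o")):
        return "o200k_base"
    # Reasonable default for other recent OpenAI chat models.
    return "cl100k_base"
```

Upgrading to tiktoken >= 0.8.0 (as the bot suggests) makes the explicit fallback unnecessary, since `encoding_for_model("o1-mini")` resolves there.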

To continue talking to Dosu, mention @dosu.

somethingwentwell commented 1 month ago

Check this PR: https://github.com/langgenius/dify/pull/9326