langgenius / dify

Dify is an open-source LLM app development platform. Dify's intuitive interface combines AI workflow, RAG pipeline, agent capabilities, model management, observability features and more, letting you quickly go from prototype to production.
https://dify.ai

fix: Azure OpenAI o1 max_completion_token error #10593

Closed · Kevin9703 closed 4 days ago

Kevin9703 commented 4 days ago

Checklist:

[!IMPORTANT]
Please review the checklist below before submitting your pull request.

Description

Fixes https://github.com/langgenius/dify/issues/9746

The deployment name of our o1 model is 'gpt-o1'. When adding the model, we got the error: "Unsupported parameter: 'max_tokens' is not supported with this model. Use 'max_completion_tokens' instead." After reviewing the code, we found that the logic only checks whether the model name starts with 'o1', so a deployment whose name does not start with 'o1' cannot be added properly. The check should therefore look for 'o1' anywhere in the model name rather than only at the beginning.

(screenshot: the "'max_tokens' is not supported with this model" error shown when adding the gpt-o1 deployment)
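For illustration, here is a minimal sketch of the parameter-selection logic this PR describes. The function name `build_token_limit_params` and its signature are hypothetical, not the actual code in dify's Azure OpenAI runtime; it only shows the substring check replacing the prefix check.

```python
# Illustrative sketch only: function and parameter names are hypothetical,
# not dify's actual Azure OpenAI runtime code.

def build_token_limit_params(model_name: str, max_tokens: int) -> dict:
    """Return the token-limit argument expected by the Azure OpenAI API.

    o1-series models reject 'max_tokens' and require 'max_completion_tokens',
    so detect 'o1' anywhere in the deployment/model name (e.g. 'gpt-o1')
    rather than only at the start of the string.
    """
    if "o1" in model_name:  # was: model_name.startswith("o1")
        return {"max_completion_tokens": max_tokens}
    return {"max_tokens": max_tokens}


# A deployment named 'gpt-o1' now gets the correct parameter,
# while non-o1 models keep using 'max_tokens'.
assert build_token_limit_params("gpt-o1", 1024) == {"max_completion_tokens": 1024}
assert build_token_limit_params("gpt-4o", 1024) == {"max_tokens": 1024}
```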

Type of Change

Testing Instructions

Please describe the tests you ran to verify your changes, and provide instructions so we can reproduce them. Please also list any relevant details of your test configuration.