langgenius / dify

Dify is an open-source LLM app development platform. Dify's intuitive interface combines AI workflow, RAG pipeline, agent capabilities, model management, observability features and more, letting you quickly go from prototype to production.
https://dify.ai

Azure OpenAI chat: 403 Forbidden #9481

Closed dale22 closed 1 week ago

dale22 commented 1 week ago

Self Checks

Dify version

0.9.2

Cloud or Self Hosted

Self Hosted (Docker)

Steps to reproduce

1. Setting up the Azure OpenAI model succeeds.
2. Using that model in chat fails; the docker-api logs show a 403:

docker-api-1 2024-10-18 04:16:55,655.655 INFO [Thread-4 (_generate_worker)] [_client.py:1038] - HTTP Request: POST https://xxxxxx/openai/deployments/GPT-4o-2024-05-13/chat/completions?api-version=2024-05-01-preview "HTTP/1.1 403 Forbidden"

✔️ Expected Behavior

The chat completion result is returned.

❌ Actual Behavior

A 403 Forbidden response is returned.

dosubot[bot] commented 1 week ago

Hello @dale22! I'm here to help you with any bugs, questions, or contributions you have regarding Dify. Let's tackle this issue together!

I found some similar discussions and issues related to the 403 Forbidden error when using Azure OpenAI with Dify:

  1. Help: I attempted to add my OpenAI key to my self-hosted Dify but was unsuccessful. - This discussion is unsolved and might provide some insights into similar issues.
  2. Connection error when set up model provider - Another unsolved discussion that might be relevant.
  3. APIError: HTTP code 403 from API - This issue is closed, but it might contain useful information.

To troubleshoot and resolve the 403 Forbidden error, consider the following steps:

These steps should help you identify and resolve the issue. If the problem persists, reviewing the logs for more detailed error messages might provide additional clues.

To continue talking to Dosu, mention @dosu.

dale22 commented 1 week ago

1. Setting up the model succeeds, so the API key, endpoint, and deployment name are all correct (screenshot).
2. curl requests from the same machine work fine (screenshot).
3. But Dify chat fails; the docker-api logs show a 403 (screenshot).
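For reference, the request such a manual curl-style check sends can be sketched as follows. The deployment name and api-version are taken from the 403 log line above; the resource host is a placeholder, not a value from this thread, and the snippet only builds the request rather than sending it:

```python
import json

# Placeholder resource host; the real one is redacted as "xxxxxx" in the logs.
resource = "your-resource.openai.azure.com"
deployment = "GPT-4o-2024-05-13"      # from the log line
api_version = "2024-05-01-preview"    # from the log line

url = (
    f"https://{resource}/openai/deployments/{deployment}"
    f"/chat/completions?api-version={api_version}"
)
# Non-streaming body; switching "stream" to True reproduces the chat failure.
body = json.dumps({"messages": [{"role": "user", "content": "ping"}],
                   "stream": False})
print(url)
print(body)
```

Sending this body with an `api-key` header (e.g. via curl) succeeds because it does not request streaming, which matches the observation that plain curl works while Dify chat does not.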

dale22 commented 1 week ago

Chat returns this error (screenshot).

crazywoola commented 1 week ago

Please check the deployment in the Azure Console; the 403 error is not raised by us.

dale22 commented 1 week ago

The Azure Console feedback says streaming requests are not yet allowed, i.e. this Azure OpenAI deployment does not support streaming. How can I disable the streaming parameter?

2024-10-21 08:05:21,141.141 ERROR [Thread-141 (_generate_worker)] [logging_callback.py:169] - Error code: 403 - {'message': 'Streaming requests are not yet allowed.'}
Traceback (most recent call last):

dale22 commented 1 week ago

2024-10-21 08:05:21,141.141 ERROR [Thread-141 (_generate_worker)] [logging_callback.py:169] - Error code: 403 - {'message': 'Streaming requests are not yet allowed.'}
Traceback (most recent call last):
  File "/app/api/core/model_runtime/model_providers/__base/large_language_model.py", line 110, in invoke
    result = self._invoke(model, credentials, prompt_messages, model_parameters, tools, stop, stream, user)
  File "/app/api/core/model_runtime/model_providers/azure_openai/llm/llm.py", line 55, in _invoke
    return self._chat_generate(
  File "/app/api/core/model_runtime/model_providers/azure_openai/llm/llm.py", line 331, in _chat_generate
    response = client.chat.completions.create(
  File "/app/api/.venv/lib/python3.10/site-packages/openai/_utils/_utils.py", line 277, in wrapper
    return func(*args, **kwargs)
  File "/app/api/.venv/lib/python3.10/site-packages/openai/resources/chat/completions.py", line 590, in create
    return self._post(
  File "/app/api/.venv/lib/python3.10/site-packages/openai/_base_client.py", line 1240, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
  File "/app/api/.venv/lib/python3.10/site-packages/openai/_base_client.py", line 921, in request
    return self._request(
  File "/app/api/.venv/lib/python3.10/site-packages/openai/_base_client.py", line 1020, in _request
    raise self._make_status_error_from_response(err.response) from None
openai.PermissionDeniedError: Error code: 403 - {'message': 'Streaming requests are not yet allowed.'}
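The traceback shows the stream flag being forwarded from invoke through _chat_generate into client.chat.completions.create. A minimal sketch of the failing request versus its non-streaming variant (build_chat_kwargs is a hypothetical illustration, not Dify code):

```python
def build_chat_kwargs(deployment, messages, stream):
    # Hypothetical stand-in for the kwargs that _chat_generate passes to
    # client.chat.completions.create(...). Flipping stream to False turns
    # the request into a single, non-streaming completion, which this Azure
    # deployment does accept.
    return {"model": deployment, "messages": messages, "stream": stream}

msgs = [{"role": "user", "content": "hi"}]
# Dify's chat path requests a stream, which the deployment rejects with 403:
streaming = build_chat_kwargs("GPT-4o-2024-05-13", msgs, stream=True)
# The non-streaming variant of the same call:
blocking = build_chat_kwargs("GPT-4o-2024-05-13", msgs, stream=False)
print(streaming["stream"], blocking["stream"])
```

Whether Dify exposes a way to force the non-streaming variant per model is the open question in this issue; the sketch only illustrates which parameter the Azure deployment is objecting to.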