I found a similar issue related to a 404 error when using the o1-preview model in Dify version 0.8.1 with a self-hosted Docker setup. The issue likely occurs because the Ollama service is not accessible from the Docker container.
To resolve this, you need to expose the Ollama service to the network. Instead of using `localhost`, you should connect to `host.docker.internal` in the service URL. Update the Base URL in Dify's settings to:

`http://host.docker.internal:11434`
This change should make the Ollama service accessible from the Docker container, resolving the 404 error [1].
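Note that on Linux, `host.docker.internal` is not defined by default; the Dify container must be started with `--add-host=host.docker.internal:host-gateway` (or the equivalent `extra_hosts` entry in docker-compose). As a quick sanity check that the container can actually reach Ollama, here is a minimal sketch, assuming Python with `requests` is available inside the container (run it via `docker exec` from the Dify API container) and that Ollama listens on its default port:

```python
# Connectivity check: can this container reach Ollama on the Docker host?
# Assumes Ollama's default port 11434 and the `requests` package installed.
import requests

BASE_URL = "http://host.docker.internal:11434"  # the Base URL to set in Dify

try:
    # /api/tags lists the models available to the Ollama server
    resp = requests.get(f"{BASE_URL}/api/tags", timeout=5)
    resp.raise_for_status()
    models = [m["name"] for m in resp.json().get("models", [])]
    print(f"Ollama reachable, models: {models}")
except requests.RequestException as exc:
    print(f"Ollama is NOT reachable from this container: {exc}")
```

If this prints a connection error, the problem is networking between the container and the host, not Dify's model configuration.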
Additionally, ensure that the Dify container can access the LocalAI endpoint if you are integrating with LocalAI for local model deployment. Use the host IP address instead of `localhost`. For example, if LocalAI is running on `http://127.0.0.1:8080`, use the host domain or LAN IP address, such as `http://192.168.1.100:8080`, in Dify's settings [2].
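The LocalAI endpoint can be verified the same way; a small sketch against LocalAI's OpenAI-compatible `/v1/models` route (the address `192.168.1.100:8080` is just the example from above — substitute your host's real LAN IP):

```python
# Probe LocalAI's OpenAI-compatible model listing from the Dify container.
# 192.168.1.100:8080 is a placeholder; use your host's actual LAN address.
import requests

LOCALAI_URL = "http://192.168.1.100:8080"

resp = requests.get(f"{LOCALAI_URL}/v1/models", timeout=5)
resp.raise_for_status()
print([m["id"] for m in resp.json().get("data", [])])
```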
For more detailed information on resolving local deployment issues, you can refer to the Self Host / Local Deployment Frequently Asked Questions (FAQs) [3].
If you're using your own API key, please note that only users with Tier 5 or higher have access to this model.
See OpenAI's documentation on reasoning models: https://platform.openai.com/docs/guides/reasoning
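To check, independently of Dify, whether a given key can see the model at all, here is a minimal sketch using the official `openai` Python package (v1.x); a key without access raises the same 404 `model_not_found` error reported below:

```python
# Check whether this API key has access to o1-preview.
# Requires: pip install openai  (v1.x), with OPENAI_API_KEY set in the environment.
from openai import OpenAI, NotFoundError

client = OpenAI()  # reads OPENAI_API_KEY

try:
    model = client.models.retrieve("o1-preview")
    print(f"Access OK: {model.id}")
except NotFoundError as exc:
    # Same 404 / model_not_found error that Dify surfaces when the
    # account tier is too low for the model.
    print(f"No access to o1-preview: {exc}")
```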
Self Checks
Dify version
0.8.1
Cloud or Self Hosted
Self Hosted (Docker)
Steps to reproduce
Update to 0.8.1 and try to use o1-preview.
✔️ Expected Behavior
o1-preview to run normally
❌ Actual Behavior
Run failed: [openai] Bad Request Error, Error code: 404 - {'error': {'message': 'The model o1-preview does not exist or you do not have access to it.', 'type': 'invalid_request_error', 'param': None, 'code': 'model_not_found'}}