Closed violinday closed 1 month ago
Hello, @violinday! I'm here to assist you with any bugs, questions, or contributions you may have while waiting for a human maintainer. Let's work together to resolve the issue you're facing.
To resolve the "Credentials validation failed with status code 404" error when integrating an OpenAI-compatible API in your self-hosted Dify instance (version 0.6.11, running on Docker), ensure the following:
- **Check `endpoint_url`**: Ensure that the `endpoint_url` provided in your credentials is correct and accessible. It should point to the base URL of the OpenAI-compatible service you are using (for example, a path ending in `/v1`). A trailing slash is added automatically before the endpoint path is joined.
- **Check `mode`**: Ensure that the `mode` in your credentials is set to either `completion` or `chat`. This determines which endpoint will be used (`completions` or `chat/completions`).
- **Verify API Key**: Ensure that the `api_key` is correct and has the necessary permissions to access the API.
- **Endpoint Availability**: Make sure that the endpoint you are trying to reach is available and not returning a 404 error. You can test this by making a direct request to the endpoint using tools like `curl` or Postman.
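As a rough equivalent of the `curl` check, here is a small stdlib-only sketch (the base URL and key are placeholders — substitute your own values) that builds the same request the validator sends and prints the final URL, so you can see exactly what the server must serve:

```python
import json
import urllib.request

# Hypothetical credentials -- replace with the values from your provider config.
base = "http://my-proxy.local/v1/"
api_key = "sk-xxxx"

body = json.dumps({
    "model": "gpt-3.5-turbo",
    "max_tokens": 5,
    "messages": [{"role": "user", "content": "ping"}],
}).encode()

req = urllib.request.Request(
    base.rstrip('/') + '/chat/completions',
    data=body,
    headers={"Content-Type": "application/json",
             "Authorization": f"Bearer {api_key}"},
)

print(req.full_url)  # the exact URL the server must serve
# urllib.request.urlopen(req, timeout=10) would actually send it; a 404 at
# this point indicates the path (not the credentials) is wrong.
```

If the printed URL does not match a route your OpenAI-compatible service exposes, the 404 is expected.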
Here is the relevant code snippet that shows how the credentials are validated:

```python
headers = {
    'Content-Type': 'application/json'
}

api_key = credentials.get('api_key')
if api_key:
    headers["Authorization"] = f"Bearer {api_key}"

endpoint_url = credentials['endpoint_url']
if not endpoint_url.endswith('/'):
    endpoint_url += '/'

# prepare the payload for a simple ping to the model
data = {
    'model': model,
    'max_tokens': 5
}

completion_type = LLMMode.value_of(credentials['mode'])

if completion_type is LLMMode.CHAT:
    data['messages'] = [
        {
            "role": "user",
            "content": "ping"
        },
    ]
    endpoint_url = urljoin(endpoint_url, 'chat/completions')
elif completion_type is LLMMode.COMPLETION:
    data['prompt'] = 'ping'
    endpoint_url = urljoin(endpoint_url, 'completions')
else:
    raise ValueError("Unsupported completion type for model configuration.")

# send a post request to validate the credentials
response = requests.post(
    endpoint_url,
    headers=headers,
    json=data,
    timeout=(10, 300)
)

if response.status_code != 200:
    raise CredentialsValidateFailedError(
        f'Credentials validation failed with status code {response.status_code}')
```
Ensure that the `endpoint_url` is correctly set and accessible, and that the `mode` is correctly specified [1].
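The trailing-slash handling in the snippet above matters because of how `urljoin` resolves relative paths; a quick illustration of why a missing slash can turn into a 404:

```python
from urllib.parse import urljoin

# With a trailing slash, the last path segment of the base URL is kept.
print(urljoin('http://host/v1/', 'chat/completions'))
# http://host/v1/chat/completions

# Without it, urljoin replaces the last segment -- '/v1' is silently dropped,
# which typically yields a 404 from the server.
print(urljoin('http://host/v1', 'chat/completions'))
# http://host/chat/completions
```

This is why the validator appends `/` to `endpoint_url` before joining the endpoint path.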
To continue talking to Dosu, mention @dosu.
I got the same issue. Is it resolved?
I got the same issue; the errors from the log file are below:

```
[ERR] 2024/07/07 - 09:36:17 | 2024070709361741415654574061408 | getAndValidateTextRequest failed: field messages is required
[ERR] 2024/07/07 - 09:36:17 | 2024070709361741415654574061408 | relay error happen, status code is 400, won't retry in this case
[GIN] 2024/07/07 - 09:36:17 | 2024070709361741415654574061408 | 400 | 739.832µs | 172.19.0.1 | POST /v1/chat/completions
[ERR] 2024/07/07 - 09:36:17 | 2024070709361741415654574061408 | relay error (channel id 7, user id: 1): field messages is required (request id: 2024070709361741415654574061408)
```
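The relay log shows `/v1/chat/completions` receiving a body without a `messages` field, which is what the validation ping contains when the credential `mode` does not match the endpoint actually hit. A hedged sketch (the helper name is hypothetical, mirroring the validation code earlier in the thread) of the payload each mode produces:

```python
def build_ping_payload(mode: str, model: str = "gpt-3.5-turbo") -> dict:
    """Mirror of the validation ping payload for each credential mode."""
    data = {"model": model, "max_tokens": 5}
    if mode == "chat":
        # sent to chat/completions
        data["messages"] = [{"role": "user", "content": "ping"}]
    elif mode == "completion":
        # sent to completions -- has no 'messages' key at all
        data["prompt"] = "ping"
    else:
        raise ValueError("Unsupported completion type for model configuration.")
    return data

print(build_ping_payload("chat"))
print(build_ping_payload("completion"))
```

If a chat-style endpoint ends up receiving the `completion`-mode body, it will reject it with exactly this "field messages is required" error, so checking that `mode` matches what the upstream expects is a good first step.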
Self Checks
Dify version
0.6.11
Cloud or Self Hosted
Self Hosted (Docker)
Steps to reproduce
request params:
error response:
✔️ Expected Behavior
The model is added successfully.
❌ Actual Behavior
An error response is returned.