langgenius / dify

Dify is an open-source LLM app development platform. Dify's intuitive interface combines AI workflow, RAG pipeline, agent capabilities, model management, observability features and more, letting you quickly go from prototype to production.
https://dify.ai

Add OpenAPI Completion API Error #5419

Closed: violinday closed this issue 1 month ago

violinday commented 3 months ago

Self Checks

Dify version

0.6.11

Cloud or Self Hosted

Self Hosted (Docker)

Steps to reproduce

Request params (from the screenshot):

{
    "model": "gpt-4o",
    "model_type": "llm",
    "credentials": {
        "mode": "completion",
        "context_size": "4096",
        "max_tokens_to_sample": "4096",
        "function_calling_type": "no_call",
        "stream_function_calling": "not_supported",
        "vision_support": "support",
        "stream_mode_delimiter": "\\n\\n",
        "api_key": "[__HIDDEN__]",
        "endpoint_url": "XXXXXXXx"
    },
    "load_balancing": {
        "enabled": false,
        "configs": []
    }
}

error response:

{
    "code": "invalid_param",
    "message": "Credentials validation failed with status code 404",
    "status": 400
}

✔️ Expected Behavior

The model should be added successfully.

❌ Actual Behavior

The request fails with the error response shown above.

dosubot[bot] commented 3 months ago

Hello, @violinday! I'm here to assist you with any bugs, questions, or contributions you may have while waiting for a human maintainer. Let's work together to resolve the issue you're facing.

To resolve the "Credentials validation failed with status code 404" error when integrating an OpenAI-compatible API in your self-hosted Dify instance (version 0.6.11, running on Docker), ensure the following:

  1. Check endpoint_url: Ensure that the endpoint_url provided in your credentials is correct and accessible. It should point to the correct API endpoint of the OpenAI-compatible service you are using. The URL should not end with a trailing slash.

  2. Check mode: Ensure that the mode in your credentials is set correctly to either completion or chat. This determines the endpoint that will be used (completions or chat/completions).

  3. Verify API Key: Ensure that the api_key is correct and has the necessary permissions to access the API.

  4. Endpoint Availability: Make sure that the endpoint you are trying to reach is available and not returning a 404 error. You can test this by making a direct request to the endpoint using tools like curl or Postman.
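Item 1 deserves special attention: the validator appends a trailing slash and then uses `urljoin`, so an `endpoint_url` that already contains the final path segment ends up with the segment doubled, which yields a 404. A minimal sketch of that behavior (the base URLs are hypothetical examples):

```python
from urllib.parse import urljoin

def build_validation_url(endpoint_url: str, mode: str) -> str:
    """Mirror how the validator joins the base URL with the mode-specific path."""
    if not endpoint_url.endswith('/'):
        endpoint_url += '/'
    path = 'chat/completions' if mode == 'chat' else 'completions'
    return urljoin(endpoint_url, path)

# A base URL ending at the API root joins as expected:
print(build_validation_url('https://example.com/v1', 'completion'))

# A base URL that already includes the full path gets the segment
# appended a second time, a common cause of the 404 seen here:
print(build_validation_url('https://example.com/v1/chat/completions', 'chat'))
```

So the `endpoint_url` should stop at the API root (e.g. `.../v1`), not include `completions` or `chat/completions` itself.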

Here is the relevant code snippet that shows how the credentials are validated:

from urllib.parse import urljoin

import requests

# `credentials`, `model`, `LLMMode` and `CredentialsValidateFailedError`
# come from the surrounding Dify module.
headers = {
    'Content-Type': 'application/json'
}

api_key = credentials.get('api_key')
if api_key:
    headers["Authorization"] = f"Bearer {api_key}"

endpoint_url = credentials['endpoint_url']
if not endpoint_url.endswith('/'):
    endpoint_url += '/'

# prepare the payload for a simple ping to the model
data = {
    'model': model,
    'max_tokens': 5
}

completion_type = LLMMode.value_of(credentials['mode'])

if completion_type is LLMMode.CHAT:
    data['messages'] = [
        {
            "role": "user",
            "content": "ping"
        },
    ]
    endpoint_url = urljoin(endpoint_url, 'chat/completions')
elif completion_type is LLMMode.COMPLETION:
    data['prompt'] = 'ping'
    endpoint_url = urljoin(endpoint_url, 'completions')
else:
    raise ValueError("Unsupported completion type for model configuration.")

# send a post request to validate the credentials
response = requests.post(
    endpoint_url,
    headers=headers,
    json=data,
    timeout=(10, 300)
)

if response.status_code != 200:
    raise CredentialsValidateFailedError(
        f'Credentials validation failed with status code {response.status_code}')

Ensure that the endpoint_url is correctly set and accessible, and that the mode is correctly specified [1].
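To make the mode dependence concrete, here is a small sketch of the validation ping payload, mirroring the snippet above (the function name is illustrative, not part of Dify's API):

```python
def build_ping_payload(model: str, mode: str) -> dict:
    """Build the minimal credential-validation payload for the given mode."""
    data = {'model': model, 'max_tokens': 5}
    if mode == 'chat':
        # Chat mode sends an OpenAI-style messages array.
        data['messages'] = [{'role': 'user', 'content': 'ping'}]
    elif mode == 'completion':
        # Completion mode sends a bare prompt instead.
        data['prompt'] = 'ping'
    else:
        raise ValueError('Unsupported completion type for model configuration.')
    return data

chat_payload = build_ping_payload('gpt-4o', 'chat')
completion_payload = build_ping_payload('gpt-4o', 'completion')
```

Note that a payload built with mode `completion` carries `prompt` and no `messages`, so a backend that only accepts the chat format will reject it.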

To continue talking to Dosu, mention @dosu.

jay-jjwu commented 2 months ago

I got the same issue. Is it resolved?

jay-jjwu commented 2 months ago


I got the same issue. Below are the errors from the log file:

[ERR] 2024/07/07 - 09:36:17 | 2024070709361741415654574061408 | getAndValidateTextRequest failed: field messages is required
[ERR] 2024/07/07 - 09:36:17 | 2024070709361741415654574061408 | relay error happen, status code is 400, won't retry in this case
[GIN] 2024/07/07 - 09:36:17 | 2024070709361741415654574061408 | 400 | 739.832µs | 172.19.0.1 | POST /v1/chat/completions
[ERR] 2024/07/07 - 09:36:17 | 2024070709361741415654574061408 | relay error (channel id 7, user id: 1): field messages is required (request id: 2024070709361741415654574061408)
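The `field messages is required` error suggests the relay received a completion-style body (a `prompt`, no `messages`) on `/v1/chat/completions`. If the backend only speaks the chat format, setting `mode` to `chat` should fix it; alternatively, a proxy in front of the backend could translate the body. A hedged sketch of that translation (the function name is illustrative):

```python
def completion_to_chat_body(body: dict) -> dict:
    """Translate a completion-style request body into chat format
    by wrapping the prompt in a single user message."""
    chat_body = {k: v for k, v in body.items() if k != 'prompt'}
    chat_body['messages'] = [{'role': 'user', 'content': body['prompt']}]
    return chat_body

body = {'model': 'gpt-4o', 'prompt': 'ping', 'max_tokens': 5}
print(completion_to_chat_body(body))
```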