danny-avila / LibreChat

Enhanced ChatGPT Clone: Features Anthropic, AWS, OpenAI, Assistants API, Azure, Groq, o1, GPT-4o, Mistral, OpenRouter, Vertex AI, Gemini, Artifacts, AI model switching, message search, langchain, DALL-E-3, ChatGPT Plugins, OpenAI Functions, Secure Multi-User System, Presets, completely open-source for self-hosting. Actively in public development.
https://librechat.ai/
MIT License

[Bug]: Error With Azure When Using LibreChat.yml #3029

Closed: IAmStoxe closed this issue 5 months ago

IAmStoxe commented 5 months ago

What happened?

I originally opened #3027, where I was told that librechat.yaml is the new way of configuring Azure OpenAI.

I modified my configuration to use it and removed the now-extraneous configuration variables from .env, but I still cannot communicate with Azure OpenAI. I now get:

Something went wrong. Here's the specific error message we encountered: An error occurred while processing your request. Please contact the Admin.

This is my librechat.yaml:

# Configuration version (required)
version: 1.0.9
# Cache settings: Set to true to enable caching
cache: true
endpoints:
  azureOpenAI:
    # Endpoint-level configuration
    titleModel: 'GPT-35-Turbo'
    plugins: true
    assistants: false
    groups:
      # Group-level configuration
      - group: 'REDACTED'
        apiKey: '${AZURE_OPENAI_API_KEY}'
        instanceName: 'REDACTED'
        version: '2024-04-01-preview'
        # Model-level configuration
        models:
          GPT-3.5-Turbo:
            deploymentName: GPT-35-Turbo
            version: '2024-04-01-preview'
          GPT-4:
            deploymentName: GPT-4
            version: '2024-04-01-preview'
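For reference, the ${AZURE_OPENAI_API_KEY} placeholder above relies on LibreChat's environment-variable substitution in librechat.yaml, so a matching entry needs to exist in .env. A minimal sketch, with an obviously hypothetical value:

# .env: the key name must match the placeholder referenced in librechat.yaml
AZURE_OPENAI_API_KEY=REDACTED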

Below is the debug console log I see. I notice it reports model_error, but I'm not sure that is actually the problem.

Steps to Reproduce

  1. Configure LibreChat for Azure via librechat.yaml as shown above
  2. Send a message via the UI
  3. Observe the error

What browsers are you seeing the problem on?

No response

Relevant log output

2024-06-10 21:25:13 info: Server listening on all interfaces at port 3080. Use http://localhost:3080 to access it
2024-06-10 21:25:13 info: Agenda successfully started and connected to MongoDB.

https://REDACTED.openai.azure.com/openai/deployments/GPT-35-Turbo/chat/completions?api-version=2024-04-01-preview
{
  model: 'GPT-3.5-Turbo',
  temperature: 0.8,
  top_p: 1,
  presence_penalty: 0,
  frequency_penalty: 0,
  stop: undefined,
  user: '66672ea9c21ef9167e482b66',
  stream: true,
  prompt: [ { role: 'user', content: 'testy' } ]
}

Response {
  [Symbol(realm)]: null,
  [Symbol(state)]: {
    aborted: false,
    rangeRequested: false,
    timingAllowPassed: true,
    requestIncludesCredentials: true,
    type: 'default',
    status: 400,
    timingInfo: {
      startTime: 401516.1067728996,
      redirectStartTime: 0,
      redirectEndTime: 0,
      postRedirectStartTime: 401516.1067728996,
      finalServiceWorkerStartTime: 0,
      finalNetworkResponseStartTime: 0,
      finalNetworkRequestStartTime: 0,
      endTime: 0,
      encodedBodySize: 22,
      decodedBodySize: 22,
      finalConnectionTimingInfo: null
    },
    cacheState: '',
    statusText: 'model_error',
    headersList: HeadersList {
      cookies: null,
      [Symbol(headers map)]: [Map],
      [Symbol(headers map sorted)]: null
    },
    urlList: [ URL {} ],
    body: { stream: undefined }
  },
  [Symbol(headers)]: HeadersList {
    cookies: null,
    [Symbol(headers map)]: Map(15) {
      'content-length' => [Object],
      'content-type' => [Object],
      'x-content-type-options' => [Object],
      'x-ms-rai-invoked' => [Object],
      'x-request-id' => [Object],
      'ms-azureml-model-error-reason' => [Object],
      'ms-azureml-model-error-statuscode' => [Object],
      'x-ms-client-request-id' => [Object],
      'apim-request-id' => [Object],
      'azureml-model-session' => [Object],
      'strict-transport-security' => [Object],
      'x-ms-region' => [Object],
      'x-ratelimit-remaining-requests' => [Object],
      'x-ratelimit-remaining-tokens' => [Object],
      'date' => [Object]
    },
    [Symbol(headers map sorted)]: null
  }
}
Error: Failed to send message. HTTP 400 - Unsupported data type

    at onopen (/media/data/home/dstokes/LibreChat2/api/app/clients/ChatGPTClient.js:356:25)
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
    at async create (/media/data/home/dstokes/LibreChat2/node_modules/@waylaidwanderer/fetch-event-source/lib/cjs/fetch.cjs:56:17) {
  status: 400
}
2024-06-10 21:31:52 error: [handleAbortError] AI response error; aborting request: Failed to send message. HTTP 400 - Unsupported data type
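
For context on where that final error string comes from: the stack trace points at the onopen handler in api/app/clients/ChatGPTClient.js, which appears to surface Azure's 400 response body ("Unsupported data type") as the message text. A rough sketch of what such a fetch-event-source onopen callback typically looks like, written for illustration only and not taken from the LibreChat source:

// Hypothetical handler, sketched to show how a non-2xx upstream response
// becomes "Failed to send message. HTTP <status> - <body>"
const onopen = async (response) => {
  if (response.status === 200) {
    return; // healthy stream; onmessage will handle the SSE events
  }
  let body = '';
  try {
    body = await response.text(); // e.g. Azure's "Unsupported data type"
  } catch {
    // ignore failures while reading the error body
  }
  const error = new Error(`Failed to send message. HTTP ${response.status} - ${body}`);
  error.status = response.status;
  throw error;
};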

Screenshots



IAmStoxe commented 5 months ago

I didn't mean to offend or do anything wrong, @danny-avila. Was it bad to open a new issue with the proper details?

danny-avila commented 5 months ago

There's no offense here. GitHub issues are meant for planned features as well as reproducible problems/bugs.

When configured correctly, Azure is working fine. Please keep troubleshooting in the discussion thread that was opened:

https://github.com/danny-avila/LibreChat/discussions/3027
