lobehub / lobe-chat

🤯 Lobe Chat - an open-source, modern-design AI chat framework. Supports Multi AI Providers( OpenAI / Claude 3 / Gemini / Ollama / Azure / DeepSeek), Knowledge Base (file upload / knowledge management / RAG ), Multi-Modals (Vision/TTS) and plugin system. One-click FREE deployment of your private ChatGPT/ Claude application.
https://chat-preview.lobehub.com

[Bug] Error when calling the Tongyi Qianwen (Qwen) API #3240

Open morningtzh opened 2 months ago

morningtzh commented 2 months ago

📦 Deployment environment

Other

📌 Software version

v1.5.1

💻 System environment

Windows, iOS

🌐 Browser

Edge, Safari

🐛 Problem description

The app is currently deployed on an Azure (North America) Container App and has always had a problem similar to #3108; early on, the error occurred regardless of whether client-side request mode was enabled. The latest version, v1.5.1, disables client-side request mode by default, yet the problem persists. Do I need to use the international API, as in #3099?

Configuration screenshot; the chat box returns the same error: image

The logs show the fetch failing:
2024-07-17T08:54:34.358024997Z Route: [qwen] ProviderBizError: [TypeError: fetch failed] { cause: [Error: AggregateError] }
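For context, Node's `TypeError: fetch failed` with an `AggregateError` cause is the generic wrapper around a transport-level failure (DNS resolution, TCP connect refused or timed out, TLS), not an HTTP error from the provider. As a rough sketch, the two cases can be told apart with a small probe run from inside the container (hypothetical helper, not part of LobeChat):

```shell
# Hypothetical probe: distinguish a transport failure (the "fetch failed"
# symptom) from an HTTP-level error. Without --fail, curl exits non-zero only
# when the connection itself fails; an HTTP 401 from a bad key still exits 0.
probe() {
  if curl -s -o /dev/null --connect-timeout 5 "$1"; then
    echo "reachable: $1"
  else
    echo "transport failure: $1"
  fi
}

# A deliberately closed local port reproduces the transport-failure case:
probe "http://127.0.0.1:1"
```

If the DashScope URL prints `transport failure` from the container, the problem is the network path, not the API key or model name.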

📷 Steps to reproduce

Same as the problem description

🚦 Expected results

Normal access to the DashScope Qwen models

📝 Supplementary information

No response

lobehubbot commented 2 months ago

👀 @morningtzh

Thank you for raising an issue. We will investigate the matter and get back to you as soon as possible. Please make sure you have given us as much context as possible.


arvinxx commented 2 months ago

"fetch failed" probably just means the connection can't be established; it feels like an international version is needed.


Mingholy commented 2 months ago

Where did you get your API key?


morningtzh commented 2 months ago

"fetch failed" probably just means the connection can't be established; it feels like an international version is needed.

Possibly. I'll try again later once the other issue adds support for the international version.


morningtzh commented 2 months ago

Where did you get your API key?

The key itself is fine: I copied it out of the environment variable, and a curl request to qwen-max with it succeeds and returns results normally.


Mingholy commented 2 months ago

From #3108 it looks like a custom model was configured: client-side requests were used at first, and after client-side requests were disabled an incorrect model name was passed in, so the unsupported parameter caused the error. Judging from your screenshot, though, the gateway is at least definitely reachable.

Per #3254 there is actually no such thing as an "international-only" API, so we need to rule out a network-environment problem. From the deployment environment, test whether these two endpoints are reachable with curl: https://dashscope.aliyuncs.com/compatible-mode/v1/chat/completions and https://dashscope-intl.aliyuncs.com/compatible-mode/v1/chat/completions. Even with an arbitrary API key you should at least get an invalid_api_key response:

curl --location 'https://dashscope-intl.aliyuncs.com/compatible-mode/v1/chat/completions' \
--header 'Authorization: Bearer sk-xxx' \
--header 'Content-Type: application/json' \
--data '{
    "model": "qwen-plus",
    "messages": [
        {
            "role": "system",
            "content": "You are a helpful assistant."
        },
        {
            "role": "user",
            "content": "Who are you"
        }
    ],
    "stream": false
}'
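Building on the curl test above, both endpoints can be checked in one pass (a sketch; the exact error body may vary, but any HTTP status at all rules out the network path):

```shell
# Sketch: verify that each DashScope endpoint answers with some HTTP status
# (e.g. 401 invalid_api_key for a dummy key) rather than failing at the
# transport level, which is what "fetch failed" indicates.
check() {
  local url="https://$1/compatible-mode/v1/chat/completions"
  # -w prints the HTTP status; 000 means the connection itself failed.
  status=$(curl -s -o /dev/null -w '%{http_code}' --connect-timeout 5 \
    -H 'Authorization: Bearer sk-dummy' "$url")
  echo "$1 -> HTTP $status"
}

check dashscope.aliyuncs.com
check dashscope-intl.aliyuncs.com
```

Any non-zero status (401, 404, 405, ...) from either host means the container can reach it; `HTTP 000` means the connection failed before any response arrived.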
morningtzh commented 2 months ago

@Mingholy While configuring GPT-4o mini today, I found that Qwen is accessible again in the new version. I'm not sure whether that's related to #3252, which you submitted yesterday.
