unit-mesh / auto-dev

🧙‍AutoDev: The AI-powered coding wizard (AI-driven coding assistant) with multilingual support 🌐, auto code generation 🏗️, and a helpful bug-slaying assistant 🐞! Customizable prompts 🎨 and a magic Auto Dev/Testing/Document/Agent feature 🧪 included! 🚀
https://ide.unitmesh.cc/
Mozilla Public License 2.0

Cannot read field "message" because "error,error" is null #168

Closed · TingsongYu closed this issue 5 months ago

TingsongYu commented 5 months ago

The LLM service is started with vLLM, and the server is IP:9889. When configuring it, clicking Test LLM Connection reports the error: Cannot read field "message" because "error,error" is null

Which configuration step is incorrect? (screenshot)

The service's API documentation is as follows: API: http://10.20.3.12:9889/v1/chat/completions Request JSON:


data = {
    "model": "/Qwen_prj/llm_model/CodeQwen1.5-7B-Chat-AWQ",
    "messages": [
        {
            "role": "system",
            "content": "You are a helpful assistant."
        },
        {
            "role": "user",
            "content": "Tell me something about large language models."
        }
    ]
}
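
To rule out the IDE side, the same request can be sent to the endpoint directly. Below is a minimal sketch in Python, assuming the host, port, and model path shown above and the standard requests library (the script itself is not part of the original report):

import requests

# Send the payload from above straight to the vLLM OpenAI-compatible endpoint.
# If this also fails, the problem is in the backend rather than the AutoDev plugin.
url = "http://10.20.3.12:9889/v1/chat/completions"
data = {
    "model": "/Qwen_prj/llm_model/CodeQwen1.5-7B-Chat-AWQ",
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Tell me something about large language models."},
    ],
}

resp = requests.post(url, json=data, timeout=60)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])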

---
I also configured it following https://ide.unitmesh.cc/custom/llm-server, but clicking Test LLM Connection still reports: **Cannot read field "message" because "error,error" is null**

Looking forward to a reply, thanks!
phodal commented 5 months ago

It looks like the OpenAI API model endpoint on your side uses the old format; the two formats are different.
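
For illustration only (hypothetical model name, not AutoDev's exact payloads), the difference between the two OpenAI-style formats is roughly this: the legacy /v1/completions format sends a single prompt string, while /v1/chat/completions sends a list of role/content messages.

# Old-style completion request (POST /v1/completions): a single prompt string.
legacy_request = {
    "model": "some-model",
    "prompt": "Tell me something about large language models.",
}

# Chat-style request (POST /v1/chat/completions): a list of role/content messages,
# which is what the vLLM endpoint in this issue expects.
chat_request = {
    "model": "some-model",
    "messages": [
        {"role": "user", "content": "Tell me something about large language models."},
    ],
}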

TingsongYu commented 5 months ago

It looks like the OpenAI API model endpoint on your side uses the old format; the two formats are different.

Do you know how to fix this?

phodal commented 5 months ago

Configure it using something like this: https://ide.unitmesh.cc/custom/llm-server#custom-llm-server-example

{ "customFields": {"model": "yi-34b-chat", "stream": true},   "messageKeys": {"role": "role", "content": "content"} }
TingsongYu commented 5 months ago

Configure it using something like this: https://ide.unitmesh.cc/custom/llm-server#custom-llm-server-example

{ "customFields": {"model": "yi-34b-chat", "stream": true},   "messageKeys": {"role": "role", "content": "content"} }

I have now configured it using the custom LLM server approach, and the current error is: java.io.IOException: AutoDevHttpException(statusCode=400, message=okhttp3.internal.http.RealResponseBod… Prompt (Json): (screenshot)

Is there a problem with the request body? The request body is as follows:


{ "customFields": {"model": "/Qwen_prj/llm_model/CodeQwen1.5-7B-Chat-AWQ", "stream": true},   "messageKeys": {"role": "system", "content": "You are a helpful assistant."} }
phodal commented 5 months ago

Oh, no, that's not right. Follow this format:

{ "customFields": {"model": "yi-34b-chat", "stream": true},   "messageKeys": {"role": "role", "content": "content"} }

The role and content here are keys used for the conversion; you don't need to change them.
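
As a conceptual sketch only (not the plugin's actual code), messageKeys tells the client which JSON key names the server expects for a message's role and content fields, so the values stay "role" and "content" unless the server uses different key names; customFields is extra data merged into the request body:

# Hypothetical illustration of how such a mapping could be applied.
message_keys = {"role": "role", "content": "content"}
custom_fields = {"model": "yi-34b-chat", "stream": True}

def build_request(messages):
    # Rename each message's fields according to message_keys, then merge custom_fields.
    body = {
        "messages": [
            {message_keys["role"]: m["role"], message_keys["content"]: m["content"]}
            for m in messages
        ],
    }
    body.update(custom_fields)
    return body

print(build_request([{"role": "user", "content": "hello"}]))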

TingsongYu commented 5 months ago

Yes, the configuration recommended by the official docs works. The earlier error was probably a problem with the vLLM backend service. Thanks!