modelscope / evalscope

A streamlined and customizable framework for efficient large model evaluation and performance benchmarking
https://evalscope.readthedocs.io/en/latest/
Apache License 2.0

How should a parameterized query template be used? Is there a problem with this approach? #160

Closed Devliang24 closed 3 weeks ago

Devliang24 commented 4 weeks ago

I expect requests to be sent based on this parameterized template:

{
    "messages": [
        {
            "content": "xxxxxx ", // 所有查询将使用此prompt
            "role": "system"
        },
        {
            "content": "xxxxx", // 问题
            "role": "user"
        }
    ],
    "stream": true,
    "model": "Qwen2.5-72B-Instruct"
}

Content format of the dataset ./datasets/my_data.jsonl:

{"content":"xxxxxx ","role":"system"},{"content":"xxxxx","role":"user"}
{"content":"xxxxxx ","role":"system"},{"content":"xxxxx","role":"user"}
{"content":"xxxxxx ","role":"system"},{"content":"xxxxx","role":"user"}

Command executed:

evalscope perf --url 'http://ip:host/v1/chat/completions' --parallel 12 --model 'Qwen2.5-72B-Instruct' --log-every-n-query 10 --read-timeout=120 -n 1 --max-prompt-length 128000 --api openai --query-template '{"model": "%m", "messages": [], "stream": true}' --dataset-path './datasets/my_data.jsonl'
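
In other words, assuming the messages from each dataset line are filled into the template's empty messages array, I expect every actual request to look like this (contents are placeholders):

{
    "model": "Qwen2.5-72B-Instruct",
    "messages": [
        {"content": "xxxxxx ", "role": "system"},
        {"content": "xxxxx", "role": "user"}
    ],
    "stream": true
}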

lxline commented 4 weeks ago

There is currently a bug; it should be fixed by https://github.com/modelscope/evalscope/pull/161.

Devliang24 commented 4 weeks ago

Thanks. Will I be able to use it tomorrow?

lxline commented 4 weeks ago

If you need it right now, you can also try moving the stream parameter outside of the query-template:

--query-template '{"model": "%m", "messages": []}' --stream

This form should work correctly.
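
For example, an untested sketch based on your original command, only moving stream out of the template:

evalscope perf --url 'http://ip:host/v1/chat/completions' --parallel 12 --model 'Qwen2.5-72B-Instruct' --log-every-n-query 10 --read-timeout=120 -n 1 --max-prompt-length 128000 --api openai --query-template '{"model": "%m", "messages": []}' --stream --dataset-path './datasets/my_data.jsonl'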

Devliang24 commented 4 weeks ago

thx