open-compass / opencompass

OpenCompass is an LLM evaluation platform supporting a wide range of models (Llama3, Mistral, InternLM2, GPT-4, LLaMA2, Qwen, GLM, Claude, etc.) over 100+ datasets.
https://opencompass.org.cn/
Apache License 2.0

[Bug] The tm_model variable in TurboMindModelwithChatTemplate may be uninitialized, causing an error #1384

Open PPnorain opened 1 month ago

PPnorain commented 1 month ago

Prerequisite

Type

I'm evaluating with the officially supported tasks/models/datasets.

Environment

python tools/prompt_viewer.py gsm8k_llama3_8b_ins.py

Reproduces the problem - code/configuration sample

gsm8k_llama3_8b_ins.py

from mmengine.config import read_base
from opencompass.partitioners import NaivePartitioner, SizePartitioner
from opencompass.tasks import OpenICLInferTask, OpenICLEvalTask
from opencompass.runners import LocalRunner

with read_base():
    # Math
    from ..datasets.gsm8k.gsm8k_gen import gsm8k_datasets as data
    from .model_select import models

datasets = [*data]

infer = dict(
    partitioner=dict(type=SizePartitioner, max_task_size=5000),
    runner=dict(
        type=LocalRunner,
        max_num_workers=4,
        task=dict(type=OpenICLInferTask)),
)

eval = dict(
    partitioner=dict(type=NaivePartitioner),
    runner=dict(
        type=LocalRunner,
        task=dict(type=OpenICLEvalTask),
        max_num_workers=16,
    )
)
from opencompass.models import TurboMindModelwithChatTemplate

models = [
    dict(
        type=TurboMindModelwithChatTemplate,
        abbr='llama-3-8b-instruct-turbomind',
        path='meta-llama/Meta-Llama-3-8B-Instruct',
        engine_config=dict(max_batch_size=16, tp=1),
        gen_config=dict(top_k=1, temperature=1e-6, top_p=0.9, max_new_tokens=1024),
        max_seq_len=7168,
        max_out_len=1024,
        batch_size=16,
        run_cfg=dict(num_gpus=1),
        stop_words=['<|end_of_text|>', '<|eot_id|>'],
    )
]

Reproduces the problem - command or script

Running the following command from the project root directory raises the error: python tools/prompt_viewer.py gsm8k_llama3_8b_ins.py

Reproduces the problem - error message

(error traceback attached as screenshot IMG_6107; not transcribed)

Other information

Root cause: lines 54–61 of turbomind_with_tf_above_v4_33.py. If tokenizer_only is True, tm_model is never initialized, but `self.generators = [tm_model.create_instance() for i in range(concurrency)]` still references tm_model, which raises an error. prompt_viewer.py happens to hit exactly this condition.
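The uninitialized-variable pattern the reporter describes can be illustrated with a minimal, self-contained sketch of a possible guard. The class and attribute names below are illustrative stand-ins, not the actual code in turbomind_with_tf_above_v4_33.py:

```python
class DummyTMModel:
    """Stand-in for the TurboMind engine object (hypothetical)."""
    def create_instance(self):
        return object()


class ModelWrapper:
    """Illustrates guarding engine-dependent setup behind tokenizer_only."""

    def __init__(self, tokenizer_only=False, concurrency=2):
        self.tokenizer_only = tokenizer_only
        if not tokenizer_only:
            tm_model = DummyTMModel()
            # Generators are only created when the engine was actually built.
            self.generators = [
                tm_model.create_instance() for _ in range(concurrency)
            ]
        else:
            # Guard: skip engine-dependent setup in tokenizer-only mode,
            # which is the path tools like prompt_viewer.py trigger.
            # Without this branch, referencing tm_model here would raise
            # UnboundLocalError.
            self.generators = []


# Tokenizer-only construction no longer touches the uninitialized tm_model.
viewer_model = ModelWrapper(tokenizer_only=True)
```

With the guard in place, the tokenizer-only path used by prompt_viewer.py constructs cleanly instead of raising on the uninitialized variable.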

tonysy commented 1 month ago

Thanks for the report, we will add this problem to our backlog. For the prompt viewer, you can use the HF version of the model configuration.
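For readers hitting the same error, a sketch of what an HF-backed configuration for the same model might look like. The class name HuggingFacewithChatTemplate and the exact fields are assumptions based on common OpenCompass configs; check the class names available in your installed version:

```python
from opencompass.models import HuggingFacewithChatTemplate

models = [
    dict(
        type=HuggingFacewithChatTemplate,
        abbr='llama-3-8b-instruct-hf',
        path='meta-llama/Meta-Llama-3-8B-Instruct',
        max_out_len=1024,
        batch_size=8,
        run_cfg=dict(num_gpus=1),
    )
]
```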