InternLM / lmdeploy

LMDeploy is a toolkit for compressing, deploying, and serving LLMs.
https://lmdeploy.readthedocs.io/en/latest/
Apache License 2.0

react test evaluation config #1861

Closed zhulinJulia24 closed 3 days ago

zhulinJulia24 commented 3 days ago
  1. Set pt_internlm2-chat-7b's max batch size to 64, because a max batch size of 128 causes OOM.
  2. Add react config templates for internlm2, llama3, and qwen1.5.
  3. Add a config template for qwen2.
  4. Add an option to use a local eval config.
  5. Use the -r option when running evaluation, so that if one model fails we can rerun it.
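The batch-size change in item 1 can be sketched as a model entry in an OpenCompass-style evaluation config. This is a minimal illustration, not the actual config from this PR; the `abbr`, `path`, and field names here are assumptions chosen for clarity.

```python
# Hypothetical sketch of an OpenCompass-style model entry for the
# lmdeploy PyTorch backend; field names are illustrative, not verbatim.
models = [
    dict(
        abbr="pt_internlm2-chat-7b",
        path="internlm/internlm2-chat-7b",
        # Reduced from 128 to 64 to avoid OOM during evaluation (item 1).
        engine_config=dict(max_batch_size=64),
        run_cfg=dict(num_gpus=1),
    ),
]
```

With such a config, rerunning a failed model (item 5) would reuse previous results for the models that already succeeded rather than evaluating everything from scratch.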