lxning opened this issue 2 months ago
Hi! Thank you for the bug report. I think we should first install llama-recipes and then install vllm, which will override the typing_extensions version. Can you help me verify whether this modification works?
git clone git@github.com:meta-llama/llama-recipes.git
cd llama-recipes
pip install -U pip setuptools
pip install -e .
pip install "lm-eval[math,ifeval,sentencepiece,vllm]==0.4.3"
cd tools/benchmarks/llm_eval_harness/meta_eval_reproduce
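After these steps, it is worth confirming that installing the vllm extra actually bumped typing_extensions past 4.10 (a quick sanity check; the exact resolved version depends on your environment):
python -c "from importlib.metadata import version; print(version('typing_extensions'))"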
System Info
PyTorch: 2.3, CUDA: 12.1
🐛 Describe the bug
I got an error when I ran the command generated from
python prepare_meta_eval.py --config_path ./eval_config.yaml
The root cause is that typing-extensions==4.8.0 is installed, but vllm requires typing_extensions >= 4.10.
Error logs
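The full log is not reproduced here, but the version clash can be confirmed locally with standard pip tooling (a minimal check, assuming the environment produced by the steps above):
pip show typing_extensions   # prints the installed version (4.8.0 in the failing environment)
pip check                    # reports broken requirements, e.g. vllm needing typing_extensions>=4.10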
Expected behavior
The command above should run successfully.