meta-llama / llama-recipes

Scripts for fine-tuning Meta Llama with composable FSDP & PEFT methods, covering single- and multi-node GPU setups. Supports default & custom datasets for applications such as summarization and Q&A. Supports a number of inference solutions, such as HF TGI and vLLM, for local or cloud deployment. Demo apps showcase Meta Llama for WhatsApp & Messenger.

upgrade typing_extensions version #645

Open lxning opened 2 months ago

lxning commented 2 months ago

System Info

PyTorch: 2.3 Cuda: 12.1

Information

🐛 Describe the bug

I got an error when I ran the command generated by python prepare_meta_eval.py --config_path ./eval_config.yaml. The root cause is that llama-recipes pins typing-extensions==4.8.0, while vllm requires typing_extensions >= 4.10.
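The version conflict can be confirmed with a small numeric comparison. Note that a naive string comparison gets this wrong ("4.8" sorts after "4.10" lexicographically), so the components must be compared as integers. This helper is a sketch for illustration, not a script from the repo:

```python
def meets_minimum(installed: str, minimum: str = "4.10") -> bool:
    """Compare dotted version strings numerically (e.g. '4.8.0' < '4.10')."""
    parse = lambda v: tuple(int(p) for p in v.split("."))
    a, b = parse(installed), parse(minimum)
    # Pad the shorter tuple with zeros so (4, 8, 0) compares against (4, 10, 0)
    n = max(len(a), len(b))
    return a + (0,) * (n - len(a)) >= b + (0,) * (n - len(b))


print(meets_minimum("4.8.0"))  # the pinned version -> False
print(meets_minimum("4.10"))   # vllm's minimum -> True
```

Running this against the pinned 4.8.0 shows it falls short of vllm's 4.10 requirement, which is why the TypeIs import below fails (TypeIs was added to typing_extensions in 4.10).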

Error logs

lm_eval --model vllm   --model_args pretrained=meta-llama/Meta-Llama-3.1-8B-Instruct,tensor_parallel_size=1,dtype=auto,gpu_memory_utilization=0.9,data_parallel_size=4,max_model_len=8192,add_bos_token=True,seed=42 --tasks meta_instruct --batch_size auto --output_path eval_results --include_path /home/ubuntu/llama-recipes/tools/benchmarks/llm_eval_harness/meta_eval_reproduce/work_dir --seed 42  --log_samples

cannot import name 'TypeIs' from 'typing_extensions'

Expected behavior

The following command should run successfully:

lm_eval --model vllm   --model_args pretrained=meta-llama/Meta-Llama-3.1-8B-Instruct,tensor_parallel_size=1,dtype=auto,gpu_memory_utilization=0.9,data_parallel_size=4,max_model_len=8192,add_bos_token=True,seed=42 --tasks meta_instruct --batch_size auto --output_path eval_results --include_path /home/ubuntu/llama-recipes/tools/benchmarks/llm_eval_harness/meta_eval_reproduce/work_dir --seed 42  --log_samples
wukaixingxp commented 2 months ago

Hi! Thank you for the bug report. I think we should first install llama-recipes and then install vllm, which will override the typing_extensions version. Can you help me verify whether this modification works?

git clone git@github.com:meta-llama/llama-recipes.git
cd llama-recipes
pip install -U pip setuptools
pip install -e .
pip install lm-eval[math,ifeval,sentencepiece,vllm]==0.4.3
cd tools/benchmarks/llm_eval_harness/meta_eval_reproduce
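If the reinstall order above works, the lm-eval/vllm install should have bumped typing_extensions to >= 4.10. One quick way to confirm before rerunning the eval is to query the installed distribution metadata; this is a standard-library sketch, not a script shipped with llama-recipes:

```python
# Report which typing_extensions version the reinstall left behind.
from importlib.metadata import PackageNotFoundError, version


def installed_version(dist: str) -> "str | None":
    """Return the installed version of a distribution, or None if absent."""
    try:
        return version(dist)
    except PackageNotFoundError:
        return None


if __name__ == "__main__":
    print("typing_extensions:", installed_version("typing_extensions"))
```

If this prints a version below 4.10 (or None), the pin was not overridden and the TypeIs import error will recur.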