I got an error when I ran the command generated from `python prepare_meta_eval.py --config_path ./eval_config.yaml`. The root cause is that typing-extensions==4.8.0 is installed, but vllm requires typing_extensions >= 4.10.
Error logs
```
lm_eval --model vllm --model_args pretrained=meta-llama/Meta-Llama-3.1-8B-Instruct,tensor_parallel_size=1,dtype=auto,gpu_memory_utilization=0.9,data_parallel_size=4,max_model_len=8192,add_bos_token=True,seed=42 --tasks meta_instruct --batch_size auto --output_path eval_results --include_path /home/ubuntu/llama-recipes/tools/benchmarks/llm_eval_harness/meta_eval_reproduce/work_dir --seed 42 --log_samples

cannot import name 'TypeIs' from 'typing_extensions'
```
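A quick way to confirm the mismatch is to check the installed version and try the failing import in isolation; `TypeIs` was only added in typing_extensions 4.10, so the import raises `ImportError` on 4.8.0:

```bash
# Show which typing_extensions version pip currently has installed
pip show typing_extensions

# Reproduce the failing import on its own; raises ImportError on
# typing_extensions < 4.10, where TypeIs does not exist yet
python -c "from typing_extensions import TypeIs; print('TypeIs available')"
```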
Hi! Thank you for the bug report. I think we should first install llama-recipes and then install vllm, which will override the typing_extensions version. Could you help verify whether this modification works?
```bash
git clone git@github.com:meta-llama/llama-recipes.git
cd llama-recipes
pip install -U pip setuptools
pip install -e .
pip install lm-eval[math,ifeval,sentencepiece,vllm]==0.4.3
cd tools/benchmarks/llm_eval_harness/meta_eval_reproduce
```
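After reinstalling in this order, a sanity check (not part of the official steps) is to confirm that vllm's dependency resolution actually bumped typing_extensions, then regenerate the command from the config:

```bash
# Confirm vllm's install pulled typing_extensions up to >= 4.10
pip show typing_extensions | grep -i '^version'

# Regenerate and rerun the evaluation command from the issue
python prepare_meta_eval.py --config_path ./eval_config.yaml
```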
System Info
PyTorch: 2.3
CUDA: 12.1
Expected behavior
The command above runs successfully.
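If reinstalling in that order is inconvenient, force-upgrading the conflicting package is a possible workaround; this is an untested suggestion, not a maintainer-confirmed fix, and it may clash with whatever pinned typing-extensions==4.8.0 in the first place:

```bash
# Untested workaround: force the version vllm needs
pip install -U "typing_extensions>=4.10"
```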