hiyouga / LLaMA-Factory

Unify Efficient Fine-Tuning of 100+ LLMs

API_PORT=8000 llamafactory-cli api examples/inference/qwen2_vllm.yaml reports an error #4555

Closed frostjsy closed 2 days ago

frostjsy commented 3 days ago

The contents of qwen2_vllm.yaml are as follows:

model_name_or_path: exports/qwen2-1.5b_lora_sft
template: qwen
infer_backend: vllm
vllm_enforce_eager: true
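For reference, the command from the issue title that loads this config and produces the error below:

```bash
API_PORT=8000 llamafactory-cli api examples/inference/qwen2_vllm.yaml
```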

The error message is as follows:

Traceback (most recent call last):
  File "/data/c/envs/py_jsy/bin/llamafactory-cli", line 5, in <module>
    from llamafactory.cli import main
  File "/data/jovyan/work/shuangye/LLaMA-Factory/src/llamafactory/__init__.py", line 3, in <module>
    from .cli import VERSION
  File "/data/jovyan/work/shuangye/LLaMA-Factory/src/llamafactory/cli.py", line 7, in <module>
    from . import launcher
  File "/data/jovyan/work/shuangye/LLaMA-Factory/src/llamafactory/launcher.py", line 1, in <module>
    from llamafactory.train.tuner import run_exp
  File "/data/jovyan/work/shuangye/LLaMA-Factory/src/llamafactory/train/tuner.py", line 4, in <module>
    from transformers import PreTrainedModel
  File "/data/c/envs/py_jsy/lib/python3.11/site-packages/transformers/__init__.py", line 26, in <module>
    from . import dependency_versions_check
  File "/data/c/envs/py_jsy/lib/python3.11/site-packages/transformers/dependency_versions_check.py", line 16, in <module>
    from .utils.versions import require_version, require_version_core
  File "/data/c/envs/py_jsy/lib/python3.11/site-packages/transformers/utils/__init__.py", line 33, in <module>
    from .generic import (
  File "/data/c/envs/py_jsy/lib/python3.11/site-packages/transformers/utils/generic.py", line 461, in <module>
    import torch.utils._pytree as _torch_pytree
  File "/data/c/envs/py_jsy/lib/python3.11/site-packages/torch/utils/__init__.py", line 4, in <module>
    from .throughput_benchmark import ThroughputBenchmark
  File "/data/c/envs/py_jsy/lib/python3.11/site-packages/torch/utils/throughput_benchmark.py", line 2, in <module>
    import torch._C
ModuleNotFoundError: No module named 'torch._C'
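The traceback fails while importing torch itself (torch._C is PyTorch's compiled C extension module), before any LLaMA-Factory or vLLM code runs, which points to a broken PyTorch installation rather than a problem with the YAML config. A minimal check, independent of LLaMA-Factory (this command is a suggestion, not from the issue):

```bash
# If this fails with the same "No module named 'torch._C'" error,
# the PyTorch install in this environment is broken.
python -c "import torch; print(torch.__version__, torch.cuda.is_available())"
```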

hiyouga commented 2 days ago

Reinstall torch.
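A minimal sketch of the reinstall, assuming pip and a CUDA 12.1 environment; the index URL and any pinned version are assumptions and should be taken from the official PyTorch install selector to match your CUDA setup:

```bash
# Remove the broken installation first.
pip uninstall -y torch

# Reinstall; the cu121 index URL is an assumption, pick the one
# matching your CUDA version (or omit --index-url for a CPU/default build).
pip install torch --index-url https://download.pytorch.org/whl/cu121
```

After reinstalling, the import check above should succeed before retrying the llamafactory-cli api command.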