Reminder
[X] I have read the README and searched the existing issues.
System Info
Running `llamafactory-cli env` raises an error.
Reproduction
```
Traceback (most recent call last):
  File "/root/miniconda3/bin/llamafactory-cli", line 5, in <module>
    from llamafactory.cli import main
  File "/ML-A100/team/align/caichenglin/code/LLaMA-Factory/src/llamafactory/__init__.py", line 17, in <module>
    from .cli import VERSION
  File "/ML-A100/team/align/caichenglin/code/LLaMA-Factory/src/llamafactory/cli.py", line 22, in <module>
    from .api.app import run_api
  File "/ML-A100/team/align/caichenglin/code/LLaMA-Factory/src/llamafactory/api/app.py", line 21, in <module>
    from ..chat import ChatModel
  File "/ML-A100/team/align/caichenglin/code/LLaMA-Factory/src/llamafactory/chat/__init__.py", line 16, in <module>
    from .chat_model import ChatModel
  File "/ML-A100/team/align/caichenglin/code/LLaMA-Factory/src/llamafactory/chat/chat_model.py", line 25, in <module>
    from .vllm_engine import VllmEngine
  File "/ML-A100/team/align/caichenglin/code/LLaMA-Factory/src/llamafactory/chat/vllm_engine.py", line 29, in <module>
    from vllm.lora.request import LoRARequest
ModuleNotFoundError: No module named 'vllm.lora'
```
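The `ModuleNotFoundError` suggests the installed vllm build does not ship the `vllm.lora` subpackage, which appears only in newer vllm releases (or the install is broken). As a minimal diagnostic sketch (the `is_importable` helper is my own, not part of LLaMA-Factory or vllm), one can probe the exact import that fails before enabling the vLLM backend:

```python
import importlib


def is_importable(module_name: str) -> bool:
    """Return True if `module_name` can be imported in this environment."""
    try:
        importlib.import_module(module_name)
        return True
    except ImportError:  # ModuleNotFoundError is a subclass of ImportError
        return False


# Probe the exact import chain that fails in the traceback above.
for name in ("vllm", "vllm.lora"):
    print(name, "->", "ok" if is_importable(name) else "missing")
```

If `vllm` imports but `vllm.lora` does not, upgrading vllm (e.g. `pip install -U vllm`, ideally to the version range pinned in LLaMA-Factory's requirements) is the usual fix.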
Expected behavior
No response
Others
No response