JungleMist opened this issue 3 months ago
Is the path C:\Users\11319\tensorrt-llm\TensorRT-LLM in your PYTHONPATH environment variable?
This problem was solved after I added the path to PYTHONPATH, but then another problem appeared.
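For anyone hitting the same thing, here is a small way to check whether a checkout directory is actually visible to the interpreter before running the example (a hedged sketch; `repo_visible` is a hypothetical helper, and the path below is just the checkout location from this issue):

```python
import os
import sys

def repo_visible(repo_root):
    """Return True if repo_root is reachable via sys.path or PYTHONPATH."""
    pythonpath = os.environ.get("PYTHONPATH", "").split(os.pathsep)
    candidates = [
        os.path.normcase(os.path.normpath(p))
        for p in sys.path + pythonpath
        if p
    ]
    return os.path.normcase(os.path.normpath(repo_root)) in candidates

# Example: the checkout from this issue (adjust to your own location).
print(repo_visible(r"C:\Users\11319\tensorrt-llm\TensorRT-LLM"))
```

If this prints False, `import tensorrt_llm` will fall back to whatever copy (if any) is installed in site-packages.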
(trt_llm) C:\Users\11319\tensorrt-llm\TensorRT-LLM\examples\llama>python3 convert_checkpoint.py --model_dir C:\Users\11319\tensorrt-llm\llama\llama-2-7b-chat --output_dir llama-2-7b-ckpt
Traceback (most recent call last):
  File "C:\Users\11319\tensorrt-llm\TensorRT-LLM\examples\llama\convert_checkpoint.py", line 8, in <module>
    import tensorrt_llm
  File "C:\Users\11319\tensorrt-llm\TensorRT-LLM\tensorrt_llm\__init__.py", line 32, in <module>
    import tensorrt_llm.functional as functional
  File "C:\Users\11319\tensorrt-llm\TensorRT-LLM\tensorrt_llm\functional.py", line 28, in <module>
    from . import graph_rewriting as gw
  File "C:\Users\11319\tensorrt-llm\TensorRT-LLM\tensorrt_llm\graph_rewriting.py", line 12, in <module>
    from .network import Network
  File "C:\Users\11319\tensorrt-llm\TensorRT-LLM\tensorrt_llm\network.py", line 27, in <module>
    from tensorrt_llm.module import Module
  File "C:\Users\11319\tensorrt-llm\TensorRT-LLM\tensorrt_llm\module.py", line 17, in <module>
    from ._common import default_net
  File "C:\Users\11319\tensorrt-llm\TensorRT-LLM\tensorrt_llm\_common.py", line 31, in <module>
    from ._utils import str_dtype_to_trt
  File "C:\Users\11319\tensorrt-llm\TensorRT-LLM\tensorrt_llm\_utils.py", line 30, in <module>
    from tensorrt_llm.bindings.BuildInfo import ENABLE_MULTI_DEVICE
ModuleNotFoundError: No module named 'tensorrt_llm.bindings'
I think you may need to build TensorRT-LLM first and then retry.
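To confirm whether the compiled bindings are actually present before retrying, a quick check can be sketched like this (a hedged sketch, assuming the compiled bindings are exposed as the submodule `tensorrt_llm.bindings`, as the traceback suggests; `has_submodule` is a hypothetical helper):

```python
import importlib.util

def has_submodule(package, sub):
    """Return True if package.sub can be located without fully importing it."""
    if importlib.util.find_spec(package) is None:
        return False
    try:
        return importlib.util.find_spec(f"{package}.{sub}") is not None
    except ModuleNotFoundError:
        return False

# e.g. has_submodule("tensorrt_llm", "bindings") should stay False
# until the compiled bindings have been built and installed.
```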
System Info
- CPU architecture: amd64
- Operating System: Windows 11
- Python version: 3.11.5
- TensorRT-LLM version: 0.10.0
- CUDA version: 12.5
- torch version: 2.2.0+cu121
Who can help?
No response
Information
Tasks
- An officially supported task in the examples folder (such as GLUE/SQuAD, ...)
Reproduction
(trt_llm) C:\Users\11319\tensorrt-llm\TensorRT-LLM\examples\llama>python3 convert_checkpoint.py --model_dir C:\Users\11319\tensorrt-llm\llama\llama-2-7b-chat --output_dir llama-2-7b-ckpt
[TensorRT-LLM] TensorRT-LLM version: 0.10.0
Traceback (most recent call last):
  File "C:\Users\11319\tensorrt-llm\TensorRT-LLM\examples\llama\convert_checkpoint.py", line 12, in <module>
    from tensorrt_llm.models import LLaMAConfig, LLaMAForCausalLM
ImportError: cannot import name 'LLaMAConfig' from 'tensorrt_llm.models' (C:\Users\11319\.conda\envs\trt_llm\lib\site-packages\tensorrt_llm\models\__init__.py)
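Note that the ImportError path points at the conda site-packages copy of tensorrt_llm rather than the source checkout, which suggests the installed wheel and the repo's examples are out of sync. A minimal diagnostic sketch to see which copy wins on sys.path (`where_and_version` is a hypothetical helper name, not part of TensorRT-LLM):

```python
import importlib

def where_and_version(name):
    """Report where a module is loaded from and its version string, if any."""
    mod = importlib.import_module(name)
    return getattr(mod, "__file__", None), getattr(mod, "__version__", "unknown")

# e.g. where_and_version("tensorrt_llm") would show whether the
# site-packages wheel or the source checkout is being imported.
```

If the reported `__file__` is under site-packages while convert_checkpoint.py comes from the repo, checking out the matching release tag (or reinstalling a matching wheel) would be the usual fix.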
Expected behavior
The llama-2-7b-ckpt directory should be created.
Actual behavior
The script fails with the ImportError shown above.
Additional notes
I just followed the official tutorial.