dusty-nv / jetson-containers

Machine Learning Containers for NVIDIA Jetson and JetPack-L4T

[Bug]: Not able to import Llama index modules in tensor rt llms #564

Status: Open · rvssridatta opened this issue 6 days ago

rvssridatta commented 6 days ago

Bug Description: Even though I am following the latest documentation, I am still not able to import LlamaIndex.

Issue 1
- NVIDIA Jetson container link: https://github.com/dusty-nv/jetson-containers?tab=readme-ov-file
- Versions: Ubuntu 22.04, CUDA 12.2
- Architecture: arm64
- JetPack: 6.0

(Screenshot attached.)

Please provide a standard solution for deploying TensorRT-LLM integrated with LlamaIndex RAG modules. Device used: Advantech Jetson Orin NX (16 GB variant).

Version: llama-index 0.10.50
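
For context, here is a minimal check of the import the report describes as failing, written against the llama-index 0.10.x API; running it inside the target container and the exact commands are illustrative assumptions, not steps from the original report.

```bash
# Hypothetical smoke test: confirm the installed llama-index version and that
# the 0.10.x core namespace imports (run inside the target container).
pip3 show llama-index
python3 -c "from llama_index.core import VectorStoreIndex, SimpleDirectoryReader; print('llama-index import OK')"
```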

**Steps to Reproduce (Issue 1):**

Followed the dusty-nv documentation step by step using the provided commands, and got the error shown in the relevant logs below when running `jetson-containers run $(autotag tensorrt-llm)`.

Relevant Logs/Tracebacks

Error:
Looking in indexes: https://pypi.org/simple, https://pypi.nvidia.com
Collecting tensorrt_llm==0.8.0
  Downloading tensorrt-llm-0.8.0.tar.gz (6.9 kB)
  Preparing metadata (setup.py) ... error
  error: subprocess-exited-with-error

  × python setup.py egg_info did not run successfully.
  │ exit code: 1
  ╰─> [6 lines of output]
      Traceback (most recent call last):
        File "<string>", line 2, in <module>
        File "<pip-setuptools-caller>", line 34, in <module>
        File "/tmp/pip-install-r7zpl9ve/tensorrt-llm_382951b6d5f34b8798d95f1967eb0620/setup.py", line 90, in <module>
          raise RuntimeError("Bad params")
      RuntimeError: Bad params
      [end of output]

  note: This error originates from a subprocess, and is likely not a problem with pip.
  error: metadata-generation-failed

× Encountered error while generating package metadata.
╰─> See above for output.

note: This is an issue with the package mentioned above, not pip.
hint: See above for details.

dusty-nv commented 5 days ago

@rvssridatta the TensorRT-LLM container is exploratory and not yet officially supported (hopefully soon). llama-index would be installed independently, or with the llama-index container.
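
A rough sketch of the two paths mentioned above, assuming the jetson-containers package for LlamaIndex is published under the name `llama_index` and that installing via pip inside a running base container (here `l4t-pytorch`, an illustrative choice) is acceptable; verify the actual package names against the repo's `packages/` directory.

```bash
# Option 1: run the dedicated LlamaIndex container (package name assumed; check packages/ in the repo).
jetson-containers run $(autotag llama_index)

# Option 2: install llama-index independently inside another container.
jetson-containers run $(autotag l4t-pytorch)
pip3 install 'llama-index==0.10.50'
```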