Open · felixkarevo opened this issue 1 month ago
When building the Model Optimizer example Docker container, I get this error:

ERROR: failed to solve: process "/bin/sh -c pip install \"tensorrt-llm~=$TRT_LLM_VERSION\" -U" did not complete successfully: exit code: 1

Any chance you will provide a Docker container for ARM64 compatibility? The main issue is that TensorRT-LLM is not compatible with my ARM64 (aarch64) architecture.
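A typical build invocation for that Dockerfile looks roughly like the following; this is an assumption, since the repository may provide its own build script with additional build args, and the image tag below is just a placeholder.

```bash
# Build the example image from the repository root (tag name is arbitrary).
# On an aarch64 host this fails at the tensorrt-llm pip install layer.
docker build -f docker/Dockerfile -t modelopt-example .
```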
If you don't need tensorrt-llm, you can remove the steps related to it in docker/Dockerfile. Note, however, that Model Optimizer is not yet officially supported on Arm; we will officially support it in the next release later this month. In the meantime, you are free to use the previous version on Arm (not Jetson Orin), and most features should still work.
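As a rough sketch only (the actual layer names and contents of docker/Dockerfile may differ, and the nvidia-modelopt package name plus the [all] extra are assumptions here), skipping the TensorRT-LLM step amounts to commenting out its install layer while keeping the Model Optimizer install:

```dockerfile
# Hypothetical excerpt of docker/Dockerfile -- not the actual file contents.
ARG TRT_LLM_VERSION=0.17.0   # placeholder version

# Skip the TensorRT-LLM layer that fails on aarch64 by commenting it out:
# RUN pip install "tensorrt-llm~=$TRT_LLM_VERSION" -U

# Keep the Model Optimizer install itself (package name assumed):
RUN pip install -U "nvidia-modelopt[all]"
```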
Note that as of our latest release, 0.19.0, we officially support SBSA Arm (not Jetson Orin).
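For anyone hitting this later, a minimal sketch of picking up that release on an SBSA (aarch64) machine, assuming the package is published on PyPI as nvidia-modelopt with linux/arm64 wheels and that the [all] extra applies:

```bash
# Confirm the host architecture (should print aarch64 on SBSA Arm).
uname -m

# Install Model Optimizer 0.19.0 or newer.
pip install -U "nvidia-modelopt[all]>=0.19.0"
```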