NVIDIA / TensorRT-Model-Optimizer

TensorRT Model Optimizer is a unified library of state-of-the-art model optimization techniques such as quantization, pruning, distillation, etc. It compresses deep learning models for downstream deployment frameworks like TensorRT-LLM or TensorRT to optimize inference speed on NVIDIA GPUs.
https://nvidia.github.io/TensorRT-Model-Optimizer

When trying to use this on Triton Server, I have to install setuptools. #66

Open dongs0104 opened 2 months ago

dongs0104 commented 2 months ago

modelopt is already installed in the tritonserver:24.08-trtllm image, but setuptools is not, so I get an error when I use it.

Please add setuptools as a dependency.

cjluo-omniml commented 2 months ago

Are you able to install setuptools as a separate step?
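A minimal sketch of that workaround, assuming pip is available inside the container (the image tag is the one from the report above):

```shell
# Workaround: install setuptools manually inside the container
# before the missing dependency is added to the image.
pip install setuptools

# Verify that setuptools is now importable.
python -c "import setuptools; print(setuptools.__version__)"
```

This only patches the running container; to make it persistent, the install step can be baked into a derived Dockerfile layered on top of the Triton image.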