# ComfyUI Dwpose TensorRT [![python](https://img.shields.io/badge/python-3.10.12-green)](https://www.python.org/downloads/release/python-31012/) [![cuda](https://img.shields.io/badge/cuda-12.4-green)](https://developer.nvidia.com/cuda-downloads) [![trt](https://img.shields.io/badge/TRT-10.0-green)](https://developer.nvidia.com/tensorrt) [![by-nc-sa/4.0](https://img.shields.io/badge/license-CC--BY--NC--SA--4.0-lightgrey)](https://creativecommons.org/licenses/by-nc-sa/4.0/deed.en)

This project provides a TensorRT implementation of DWPose for ultra-fast pose estimation inside ComfyUI.

This project is licensed under CC BY-NC-SA; everyone is free to access, use, modify, and redistribute it under the same license.

For commercial purposes, please contact me directly at yuvraj108c@gmail.com.

If you like the project, please give me a star! ⭐


ā±ļø Performance

Note: The following results were benchmarked on FP16 engines inside ComfyUI, using 1000 similar frames.

| Device | FPS |
|--------|-----|
| L40s   | 20  |

## 🚀 Installation

Navigate to the ComfyUI `/custom_nodes` directory:

```bash
git clone https://github.com/yuvraj108c/ComfyUI-Dwpose-Tensorrt
cd ./ComfyUI-Dwpose-Tensorrt
pip install -r requirements.txt
```
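The nodes assume a working TensorRT Python installation (the badges above indicate TensorRT 10). As a quick sanity check before building engines, you can verify the package imports and report its version:

```python
import tensorrt as trt

# Expect a 10.x version to match the environment this repo targets
print(trt.__version__)
```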

## 🛠️ Building TensorRT Engine

1. Download the two DWPose onnx models, e.g. from https://huggingface.co/yzd-v/DWPose/tree/main:

   - `yolox_l.onnx` (person detection)
   - `dw-ll_ucoco_384.onnx` (pose estimation)

2. Build TensorRT engines for both of these models by running (a sketch of what this step does is shown after this list):

   ```bash
   python export_trt.py
   ```

3. Place the exported engines inside the ComfyUI `/models/tensorrt/dwpose` directory.
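For reference, `export_trt.py` converts each ONNX model into a serialized TensorRT engine. A minimal sketch of that conversion with the standard TensorRT Python API looks roughly like this (the paths are illustrative; the script in this repo handles the actual arguments):

```python
import tensorrt as trt

# Illustrative paths -- substitute each downloaded DWPose onnx model in turn
ONNX_PATH = "dw-ll_ucoco_384.onnx"
ENGINE_PATH = "dw-ll_ucoco_384.engine"

logger = trt.Logger(trt.Logger.INFO)
builder = trt.Builder(logger)
network = builder.create_network(0)  # explicit batch is the default in TRT 10
parser = trt.OnnxParser(network, logger)

# Parse the ONNX graph into a TensorRT network definition
with open(ONNX_PATH, "rb") as f:
    if not parser.parse(f.read()):
        for i in range(parser.num_errors):
            print(parser.get_error(i))
        raise RuntimeError(f"Failed to parse {ONNX_PATH}")

# Build an FP16 engine (the benchmark above was measured on FP16 engines)
config = builder.create_builder_config()
config.set_flag(trt.BuilderFlag.FP16)
serialized_engine = builder.build_serialized_network(network, config)

with open(ENGINE_PATH, "wb") as f:
    f.write(serialized_engine)
```

Engine files are specific to the GPU, TensorRT version, and precision they were built with, which is why they are built locally rather than downloaded.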

## ☀️ Usage

## 🤖 Environment tested

## 👏 Credits

## License

Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International (CC BY-NC-SA 4.0)