chaiNNer-org / chaiNNer

A node-based image processing GUI aimed at making chaining image processing tasks easy and customizable. Born as an AI upscaling application, chaiNNer has grown into an extremely flexible and powerful programmatic image processing application.
https://chaiNNer.app
GNU General Public License v3.0

Add TensorRT support #916

Closed by C43H66N12O12S2 1 year ago

C43H66N12O12S2 commented 2 years ago

Motivation: TensorRT provides transformative speedups, usually more than 3x, compared to regular PyTorch. This is especially useful for large architectures, such as SwinIR, that take far longer than the usual (Real)ESRGAN. On my machine, SwinIR requires 700s for a 4x upscale on a 3080 12GB.
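As a rough back-of-the-envelope check (the 700s figure is from the comment above; the 3x factor is the claimed typical speedup, not a benchmark):

```python
# Estimate what a TensorRT-class speedup would mean for the 700 s
# SwinIR 4x upscale quoted above. The 3x factor is the claimed typical
# speedup, not a measured result.
baseline_s = 700
speedup = 3
estimated_s = baseline_s / speedup
print(round(estimated_s))  # roughly 233 s
```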

Description: Engine building and inference support for TensorRT. chaiNNer already features ONNX support, so this should be easier than starting from scratch. Ideally, support would cover the full pipeline: PyTorch model -> engine building -> save engine and/or inference.

joeyballentine commented 2 years ago

I have considered this before. The issue is that TensorRT is currently in a weird state where it's difficult to install, and things like Torch-TensorRT have not-so-great support (for example, no pre-built Windows wheels). TensorRT support for ONNX would be really easy since it's already included in onnxruntime-gpu, but the user needs an Nvidia developer account and must manually set up the TensorRT path variables and everything in order for it to work.
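For context, onnxruntime exposes TensorRT as an execution provider named `TensorrtExecutionProvider`, which only shows up when TensorRT is installed and on the library path. A minimal sketch of the kind of preference logic a settings toggle could use (the `pick_providers` helper is hypothetical, not chaiNNer code):

```python
# Preference order for onnxruntime execution providers.
# "TensorrtExecutionProvider" is onnxruntime's real identifier for its
# TensorRT backend; it is only reported as available when TensorRT is
# set up correctly on the machine.
PREFERRED = [
    "TensorrtExecutionProvider",
    "CUDAExecutionProvider",
    "CPUExecutionProvider",
]

def pick_providers(available):
    """Keep the preferred providers that are actually available, in order,
    falling back to CPU if none match."""
    chosen = [p for p in PREFERRED if p in available]
    return chosen or ["CPUExecutionProvider"]
```

With onnxruntime-gpu installed, the result would be passed as the `providers` argument to `onnxruntime.InferenceSession`, with `onnxruntime.get_available_providers()` supplying the available list.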

So it's possible, but I'd rather wait and support it once TensorRT offers a better user and developer experience. That said, maybe I could add a settings toggle for ONNX TRT for users who know what they're doing and have their environment variables set up properly.

C43H66N12O12S2 commented 2 years ago

> So, it's possible. But, I'd rather wait and support it when TensorRT is a better user and developer experience. Though, maybe I could have a settings toggle for ONNX TRT for users that know what they're doing and have their environment variables set up properly to use it.

Personally, I would appreciate this feature. I already have an NVIDIA Developer account, and making one is easy (there is no application process and no fee, so anybody could have one), and I would like to improve SwinIR's glacial speed.

0x4E69676874466F78 commented 1 year ago

I would try TensorRT on Windows if it gave me a real performance boost.

joeyballentine commented 1 year ago

TensorRT support has been added as of v0.12.5 via ONNX. Set it up manually and it should appear in your execution provider dropdown in settings.
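For reference, the manual setup usually amounts to pointing the dynamic loader at the TensorRT libraries. A sketch for Linux follows; the install path is hypothetical, so substitute wherever you unpacked TensorRT:

```shell
# Hypothetical install location -- adjust to your own TensorRT directory.
# On Windows, add the equivalent lib folder to the PATH variable instead.
export LD_LIBRARY_PATH="/opt/TensorRT/lib:${LD_LIBRARY_PATH}"
```

After this, onnxruntime-gpu should report TensorRT among its available execution providers, and it should appear in chaiNNer's dropdown as described above.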

zelenooki87 commented 1 year ago

Hi @C43H66N12O12S2, could you please share the SwinIR Large x4 model in TensorRT format? Thank you very much!