NVIDIA / TensorRT

NVIDIA® TensorRT™ is an SDK for high-performance deep learning inference on NVIDIA GPUs. This repository contains the open source components of TensorRT.
https://developer.nvidia.com/tensorrt
Apache License 2.0

ERROR: root:Exporting to ONNX failed #3958

Open dezorianguy opened 1 week ago

dezorianguy commented 1 week ago

I am trying to use Nvidia TensorRT within my Stable Diffusion Forge environment.

I use Stability Matrix for my Stable Diffusion programs and installation of models. In Forge, I installed the TensorRT extension, enabled sd unet in the interface, and when I try to export an engine for a model, I get the following errors in the command screen:

    ERROR:root:Exporting to ONNX failed. module 'torch.nn.functional' has no attribute 'scaled_dot_product_attention'
    Building TensorRT engine... This can take a while, please check the progress in the terminal.
    Building TensorRT engine for C:\StabilityMatrix\Data\Packages\Stable Diffusion WebUI Forge\models\Unet-onnx\autismmixSDXL_autismmixConfetti.onnx: C:\StabilityMatrix\Data\Packages\Stable Diffusion WebUI Forge\models\Unet-trt\autismmixSDXL_autismmixConfetti_10047b0e_cc86_sample=2x4x128x128+2x4x128x128+2x4x128x128-timesteps=2+2+2-encoder_hidden_states=2x77x2048+2x77x2048+2x77x2048-y=2x2816+2x2816+2x2816.trt
    Could not open file C:\StabilityMatrix\Data\Packages\Stable Diffusion WebUI Forge\models\Unet-onnx\autismmixSDXL_autismmixConfetti.onnx
    Could not open file C:\StabilityMatrix\Data\Packages\Stable Diffusion WebUI Forge\models\Unet-onnx\autismmixSDXL_autismmixConfetti.onnx
    [W] 'colored' module is not installed, will not use colors when logging. To enable colors, please install the 'colored' module: python3 -m pip install colored
    [E] ModelImporter.cpp:773: Failed to parse ONNX model from file: C:\StabilityMatrix\Data\Packages\Stable Diffusion WebUI Forge\models\Unet-onnx\autismmixSDXL_autismmixConfetti.onnx
    [!] Failed to parse ONNX model. Does the model file exist and contain a valid ONNX model?

Environment

NVIDIA GPU: GeForce RTX 3060 12GB

NVIDIA Driver Version: latest

CUDA Version: 12.1

Operating System: Windows 11

Python Version: 3.10

PyTorch Version: torch-2.3.1+cu121, torchaudio-2.3.1+cu121, torchvision-0.18.1+cu121, xformers-0.0.26.post1

Baremetal or Container: Baremetal

Relevant Files

Model link: AutismMixXL Confetti

Steps To Reproduce

Commands or scripts:

1. Install the TensorRT extension in Stable Diffusion Forge.
2. Enable sd unet in the interface.
3. Try to export an engine for the model.

Have you tried the latest release?: Yes

Can this model run on other frameworks?: Have not tried running the ONNX model with ONNXRuntime (polygraphy run --onnxrt).

I am looking for assistance in resolving these errors to successfully export and run the TensorRT engine.

lix19937 commented 1 week ago

    [E] ModelImporter.cpp:773: Failed to parse ONNX model from file: C:\StabilityMatrix\Data\Packages\Stable Diffusion WebUI Forge\models\Unet-onnx\autismmixSDXL_autismmixConfetti.onnx
    [!] Failed to parse ONNX model. Does the model file exist and contain a valid ONNX model?
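Before re-running the parser, it is worth confirming that the export actually produced a file: the "Could not open file" lines in the original log suggest the ONNX export failed before anything was written to disk. A minimal stdlib sketch (the path is copied from the log; the helper name is made up for illustration):

```python
import os

def onnx_export_succeeded(path):
    """Return True only if the file exists and is non-empty; a failed
    export often leaves no file (or a zero-byte one) behind."""
    return os.path.isfile(path) and os.path.getsize(path) > 0

# Path taken from the error log above:
onnx_path = r"C:\StabilityMatrix\Data\Packages\Stable Diffusion WebUI Forge\models\Unet-onnx\autismmixSDXL_autismmixConfetti.onnx"
```

If this returns False, the problem is upstream in the ONNX export step, not in the TensorRT parser.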

First, use polygraphy to check and optimize the ONNX model:

     polygraphy surgeon sanitize model.onnx --fold-constants --output model_folded.onnx

Then run trtexec --verbose ... to get the full log.
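The two steps above can be chained in a small helper script. This is a sketch, assuming polygraphy and trtexec are on PATH; the file paths and the --saveEngine argument are illustrative, not from the original thread:

```python
import shutil
import subprocess

def build_commands(onnx_path, folded_path, engine_path):
    """Build the sanitize-then-build command lines suggested above."""
    sanitize = ["polygraphy", "surgeon", "sanitize", onnx_path,
                "--fold-constants", "--output", folded_path]
    build = ["trtexec", f"--onnx={folded_path}",
             f"--saveEngine={engine_path}", "--verbose"]
    return sanitize, build

def run_pipeline(onnx_path, folded_path, engine_path):
    # Only attempt the pipeline if both tools are actually installed.
    if shutil.which("polygraphy") is None or shutil.which("trtexec") is None:
        raise RuntimeError("polygraphy and trtexec must be on PATH")
    for cmd in build_commands(onnx_path, folded_path, engine_path):
        subprocess.run(cmd, check=True)
```

Running with --verbose makes trtexec print the full parser log, which should pinpoint where the ONNX model is rejected.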