Closed: 78Alpha closed this issue 1 year ago.
The issue appears to be resolved in 1.5, for reasons unknown.
Occurred in the DEV branch as well as v1.4.0. The error happens with all models, regardless of base. I suspect it has to do with the ONNX files themselves: opset 17 runs into this error, opset 16 parses fine but generates nothing, and so on for every other version.
Running TensorRT 8.6.1.6.
VRAM shouldn't be an issue, running a 3090. Same with system RAM, 48 GB.
```
[07/18/2023-14:17:05] [E] [TRT] ModelImporter.cpp:732: ERROR: builtin_op_importers.cpp:5428 In function importFallbackPluginImporter:
[8] Assertion failed: creator && "Plugin not found, are the plugin name, version, and namespace correct?"
[07/18/2023-14:17:05] [E] Failed to parse onnx file
[07/18/2023-14:17:05] [I] Finished parsing network model. Parse time: 1.98327
[07/18/2023-14:17:05] [E] Parsing model failed
[07/18/2023-14:17:05] [E] Failed to create engine from model or file.
[07/18/2023-14:17:05] [E] Engine set up failed
RuntimeError: Error running command.
Command: "H:\Utilities\CUDA\SDDev\stable-diffusion-webui\extensions\stable-diffusion-webui-tensorrt\TensorRT-8.6.1.6\bin\trtexec.exe" --onnx="H:/Utilities/CUDA/SDDev/stable-diffusion-webui/models/Unet-onnx/Anything-V3.0.onnx" --saveEngine="H:\Utilities\CUDA\SDDev\stable-diffusion-webui\models\Unet-trt\Anything-V3.0.trt" --minShapes=x:2x4x64x64,context:2x77x768,timesteps:2 --maxShapes=x:2x4x64x64,context:2x77x768,timesteps:2 --fp16
Error code: 1
```
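For context, the "Plugin not found" assertion means the ONNX parser hit a node type it could not import natively, so it fell back to looking for a TensorRT plugin with that node's name. A minimal troubleshooting sketch, assuming you rerun trtexec by hand with `--verbose` so the log names the exact node that triggers `importFallbackPluginImporter` (paths here are illustrative, not the extension's own invocation):

```shell
# Sketch: rerun the failing parse with verbose logging to identify the
# offending ONNX node. TRTEXEC and ONNX_MODEL are illustrative paths.
TRTEXEC="TensorRT-8.6.1.6/bin/trtexec"
ONNX_MODEL="models/Unet-onnx/Anything-V3.0.onnx"

# Printed instead of executed here; drop the `echo` to actually run it.
echo "$TRTEXEC" --onnx="$ONNX_MODEL" --fp16 --verbose
```

If the graph genuinely needs a custom plugin, trtexec's `--plugins` option can preload the library that registers it before parsing.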
Hi, I have a question about how you convert SD to ONNX. I converted an SD model to ONNX format, but I get an error when converting it to TensorRT afterwards.
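Since the opset version seems to determine whether the file parses at all, a first step when debugging a conversion like this is to confirm which opset the exported file actually declares. A minimal sketch, assuming the `onnx` Python package is installed; the path and the `opset_versions` helper are illustrative, not part of the extension:

```python
import os

def opset_versions(model):
    """Return {domain: version} for every opset the model declares
    ('' is the default ai.onnx domain)."""
    return {imp.domain: imp.version for imp in model.opset_import}

# Illustrative path matching the report above; adjust for your setup.
path = "models/Unet-onnx/Anything-V3.0.onnx"
if os.path.exists(path):
    import onnx  # requires the `onnx` package
    print(opset_versions(onnx.load(path)))
```

Comparing this output against the opset you asked the exporter for can reveal whether the exporter silently fell back to a different version.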