python3 convert_unet.py --ckpt_path ../../models/checkpoints/sd_xl_base_1.0_0.9vae.safetensors
Total VRAM 8192 MB, total RAM 48178 MB
xformers version: 0.0.23.post1
Set vram state to: NORMAL_VRAM
Device: cuda:0 NVIDIA GeForce RTX 2070 : native
VAE dtype: torch.float32
Using xformers cross attention
model_type EPS
adm 2816
detected baseline model version: SDXL
Exporting sd_xl_base_1.0_0.9vae.safetensors to TensorRT
[I] size & shape parameters:
batch size: min=1, opt=1, max=1
height: min=768, opt=1024, max=1024
width: min=768, opt=1024, max=1024
token count: min=75, opt=75, max=150
/home/wsluser/ComfyUI/custom_nodes/comfy-trt-test/../../comfy/ldm/modules/diffusionmodules/openaimodel.py:841: TracerWarning: Converting a tensor to a Python boolean might cause the trace to be incorrect. We can't record the data flow of Python values, so this value will be treated as a constant in the future. This means that the trace might not generalize to other inputs!
assert y.shape[0] == x.shape[0]
/home/wsluser/ComfyUI/custom_nodes/comfy-trt-test/../../comfy/ldm/modules/diffusionmodules/openaimodel.py:122: TracerWarning: Converting a tensor to a Python boolean might cause the trace to be incorrect. We can't record the data flow of Python values, so this value will be treated as a constant in the future. This means that the trace might not generalize to other inputs!
assert x.shape[1] == self.channels
/home/wsluser/ComfyUI/custom_nodes/comfy-trt-test/../../comfy/ldm/modules/attention.py:289: TracerWarning: Converting a tensor to a Python boolean might cause the trace to be incorrect. We can't record the data flow of Python values, so this value will be treated as a constant in the future. This means that the trace might not generalize to other inputs!
if b * heads > 65535:
/home/wsluser/.local/lib/python3.10/site-packages/xformers/ops/fmha/common.py:178: TracerWarning: Converting a tensor to a Python boolean might cause the trace to be incorrect. We can't record the data flow of Python values, so this value will be treated as a constant in the future. This means that the trace might not generalize to other inputs!
self.query.shape == (B, Mq, K)
/home/wsluser/.local/lib/python3.10/site-packages/xformers/ops/fmha/common.py:179: TracerWarning: Converting a tensor to a Python boolean might cause the trace to be incorrect. We can't record the data flow of Python values, so this value will be treated as a constant in the future. This means that the trace might not generalize to other inputs!
and self.key.shape == (B, Mkv, K)
/home/wsluser/.local/lib/python3.10/site-packages/xformers/ops/fmha/common.py:180: TracerWarning: Converting a tensor to a Python boolean might cause the trace to be incorrect. We can't record the data flow of Python values, so this value will be treated as a constant in the future. This means that the trace might not generalize to other inputs!
and self.value.shape == (B, Mkv, Kv)
/home/wsluser/.local/lib/python3.10/site-packages/xformers/ops/fmha/common.py:277: TracerWarning: Converting a tensor to a Python boolean might cause the trace to be incorrect. We can't record the data flow of Python values, so this value will be treated as a constant in the future. This means that the trace might not generalize to other inputs!
if not cls.SUPPORTS_DIFFERENT_VALUE_EMBED and K != Kv:
/home/wsluser/.local/lib/python3.10/site-packages/xformers/ops/fmha/common.py:279: TracerWarning: Converting a tensor to a Python boolean might cause the trace to be incorrect. We can't record the data flow of Python values, so this value will be treated as a constant in the future. This means that the trace might not generalize to other inputs!
if max(K, Kv) > cls.SUPPORTED_MAX_K:
/home/wsluser/.local/lib/python3.10/site-packages/xformers/ops/fmha/common.py:540: TracerWarning: Converting a tensor to a Python boolean might cause the trace to be incorrect. We can't record the data flow of Python values, so this value will be treated as a constant in the future. This means that the trace might not generalize to other inputs!
if x.shape[-1] % alignment != 0:
ERROR:root:Exporting to ONNX failed. unsupported output type: int, from operator: xformers::efficient_attention_forward_cutlass
Building TensorRT engine… This can take a while.
Building TensorRT engine for /home/wsluser/ComfyUI/custom_nodes/comfy-trt-test/comfy_trt/Unet-onnx/sd_xl_base_1.0_0.9vae.onnx: /home/wsluser/ComfyUI/custom_nodes/comfy-trt-test/comfy_trt/Unet-trt/sd_xl_base_1.0_0.9vae_387ffbda0547d0a571e2d78607b1ccab.trt
Could not open file /home/wsluser/ComfyUI/custom_nodes/comfy-trt-test/comfy_trt/Unet-onnx/sd_xl_base_1.0_0.9vae.onnx
Could not open file /home/wsluser/ComfyUI/custom_nodes/comfy-trt-test/comfy_trt/Unet-onnx/sd_xl_base_1.0_0.9vae.onnx
[E] ModelImporter.cpp:733: Failed to parse ONNX model from file: /home/wsluser/ComfyUI/custom_nodes/comfy-trt-test/comfy_trt/Unet-onnx/sd_xl_base_1.0_0.9vae.onnx
[!] Failed to parse ONNX model. Does the model file exist and contain a valid ONNX model?
Traceback (most recent call last):
File "/home/wsluser/ComfyUI/custom_nodes/comfy-trt-test/convert_unet.py", line 141, in <module>
ret = export_trt(trt_path, onnx_path, timing_cache, profile=profile, use_fp16=not args.float32)
File "/home/wsluser/ComfyUI/custom_nodes/comfy-trt-test/comfy_trt/exporter.py", line 157, in export_trt
ret = engine.build(onnx_path, use_fp16, enable_refit=True, timing_cache=timing_cache, input_profile=[profile])
File "/home/wsluser/ComfyUI/custom_nodes/comfy-trt-test/comfy_trt/utilities.py", line 151, in build
network = network_from_onnx_path(onnx_path, flags=[trt.OnnxParserFlag.NATIVE_INSTANCENORM])
File "<string>", line 3, in network_from_onnx_path
File "/home/wsluser/.local/lib/python3.10/site-packages/polygraphy/backend/base/loader.py", line 40, in call
return self.call_impl(*args, **kwargs)
File "/home/wsluser/.local/lib/python3.10/site-packages/polygraphy/util/util.py", line 710, in wrapped
return func(*args, **kwargs)
File "/home/wsluser/.local/lib/python3.10/site-packages/polygraphy/backend/trt/loader.py", line 227, in call_impl
trt_util.check_onnx_parser_errors(parser, success)
File "/home/wsluser/.local/lib/python3.10/site-packages/polygraphy/backend/trt/util.py", line 86, in check_onnx_parser_errors
G_LOGGER.critical("Failed to parse ONNX model. Does the model file exist and contain a valid ONNX model?")
File "/home/wsluser/.local/lib/python3.10/site-packages/polygraphy/logger/logger.py", line 605, in critical
raise ExceptionType(message) from None
polygraphy.exception.exception.PolygraphyException: Failed to parse ONNX model. Does the model file exist and contain a valid ONNX model?
I've tried TensorRT v9 and v8, and all the dependencies are installed.
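For what it's worth, the parser error looks like a downstream symptom: the log shows `ERROR:root:Exporting to ONNX failed` (the xformers `efficient_attention_forward_cutlass` op returns an `int` the tracer can't export), so the `.onnx` file is presumably never written, and the TensorRT build then fails with "Could not open file". A minimal check like this (path assumed relative to the `comfy-trt-test` directory; adjust for your install) confirms whether the export stage actually produced an artifact before blaming the parser:

```python
import os

def check_onnx_artifact(path):
    """Return a short status string for an exported ONNX file.

    "Could not open file" from the TensorRT build stage usually means the
    export stage never wrote the model, so checking the artifact first
    separates the export failure from the parse failure.
    """
    if not os.path.isfile(path):
        return "missing"   # export step failed before writing anything
    if os.path.getsize(path) == 0:
        return "empty"     # export crashed mid-write
    return "present"

# Path taken from the log above; hypothetical relative location.
print(check_onnx_artifact("comfy_trt/Unet-onnx/sd_xl_base_1.0_0.9vae.onnx"))
```

If this prints `missing`, the fix has to target the ONNX export (e.g. exporting with xformers attention disabled so the tracer only sees standard PyTorch ops), not the TensorRT side.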