J-Cott opened this issue 3 months ago
When enabling ControlNet on one profile I exported, it always falls back to PyTorch:

Profile 1 (Use Static Shapes not ticked)

              Min    Max
Height        768   1024
Width         768   1024
Batch Size      1      1
Text length    75     75

Warning: Enabling PyTorch fallback as no engine was found.
On a different profile, which I built with the static shapes option:

Profile 0

              Min    Opt    Max
Height       1024   1024   1024
Width        1024   1024   1024
Batch Size      1      1      1
Text length    75     75     75

I exported it with ControlNet ticked, but I just get this error (even when enabling only the U-Net, without a ControlNet active):

Expected all tensors to be on the same device, but found at least two devices, cpu and cuda:0! (when checking argument for argument mat1 in method wrapper_CUDA_addmm)
I'm not sure if that is enough to go on. Has anyone figured out the settings to get TensorRT to work with ControlNets yet?
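For context, the addmm error above is PyTorch's generic complaint that a linear layer's weights and its input tensor live on different devices (here: the model on cuda:0, some conditioning tensor still on the CPU). A minimal sketch of the mismatch and the usual workaround, using a plain `torch.nn.Linear` as a stand-in for the real U-Net/ControlNet modules (the function name `run_on_model_device` is mine, not part of any extension):

```python
import torch

def run_on_model_device(model: torch.nn.Module, x: torch.Tensor) -> torch.Tensor:
    """Move the input onto whatever device the model's weights live on
    before calling it -- the pattern that avoids the cpu/cuda:0 mismatch."""
    device = next(model.parameters()).device
    return model(x.to(device))

linear = torch.nn.Linear(4, 2)
if torch.cuda.is_available():
    linear = linear.cuda()  # weights now on cuda:0; a raw CPU input would fail

x = torch.randn(3, 4)  # created on CPU, like a stray ControlNet tensor
out = run_on_model_device(linear, x)
print(out.shape)  # torch.Size([3, 2])
```

This only illustrates the failure mode; in the extension itself, the tensor that stays on the CPU is created somewhere inside the ControlNet code path, so the real fix has to land there rather than in user code.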
We're having the same issue!
Setup: Ubuntu 22.04 LTS, PyTorch 2.2.0, CUDA 12.1
Any updates here?