stochasticai / x-stable-diffusion

Real-time inference for Stable Diffusion - 0.88s latency. Covers AITemplate, nvFuser, TensorRT, FlashAttention. Join our Discord community: https://discord.com/invite/TgHXuSJEk6
https://stochastic.ai
Apache License 2.0
553 stars 35 forks

tensorrt conversion fails #19

Closed harishprabhala closed 1 year ago

harishprabhala commented 2 years ago

Hi. I am trying to convert my ONNX file to TensorRT using `convert_unet_to_tensorrt.py`, but I get the error below:

Traceback (most recent call last):
  File "convert_unet_to_tensorrt.py", line 52, in <module>
    convert(args)
  File "convert_unet_to_tensorrt.py", line 47, in convert
    f.write(serialized_engine)
TypeError: a bytes-like object is required, not 'NoneType' 

Except for the ONNX file mentioned in the library, none of my other ONNX files seem to convert. Any fix? Thanks.
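For what it's worth, that `TypeError` means `build_serialized_network()` returned `None`, i.e. the engine build itself failed before the script ever got to the write. A minimal guard (a hypothetical helper sketch, not the repo's actual code) turns the opaque `TypeError` into a readable failure:

```python
def write_engine(serialized_engine, path):
    """Write a serialized TensorRT engine to disk, failing loudly on a bad build."""
    # TensorRT's build_serialized_network() returns None when the build fails,
    # so the f.write() call below would otherwise raise the TypeError above.
    if serialized_engine is None:
        raise RuntimeError(
            "Engine build failed (serialized_engine is None); "
            "check the TensorRT logger output and your TensorRT version."
        )
    with open(path, "wb") as f:
        f.write(serialized_engine)
```

With a check like this, a version mismatch or unsupported ONNX op surfaces as a clear build failure instead of a `NoneType` write error.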

Toan-Do commented 2 years ago

Hi @harishprabhala, are you using TensorRT 8.4.2.2.4? Our code requires TensorRT 8.4.2.2.4 to run.
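A quick way to catch this up front is to compare `tensorrt.__version__` against the required release before converting. A sketch of such a check (a hypothetical helper; the default version string is an assumption based on the comment above):

```python
def version_tuple(v: str):
    # "8.4.1.5" -> (8, 4, 1, 5); numeric comparison avoids string-ordering bugs
    # (e.g. "8.10" < "8.4" when compared as strings).
    return tuple(int(part) for part in v.split("."))

def trt_version_ok(installed: str, required: str = "8.4.2") -> bool:
    """Return True when the installed TensorRT version is at least `required`."""
    return version_tuple(installed) >= version_tuple(required)
```

For example, `trt_version_ok("8.4.1.5")` is `False`, while `trt_version_ok("8.4.2.4")` is `True`, matching the versions discussed in this thread.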

harishprabhala commented 2 years ago

I am using 8.4.1.5. I'll update it to the new version and try again. Will keep you posted.

harishprabhala commented 2 years ago

I have tried it with 8.4.2 and I get the following error in demo.py:

[screenshot of the error attached]

harishprabhala commented 2 years ago

If I comment out line 144, the denoising isn't happening, and this is the output of demo.py:

[screenshot of the generated output attached]

Toan-Do commented 2 years ago

@harishprabhala It seems that the conversion in your environment did not work well. How about trying it on Google Colab: https://github.com/stochasticai/x-stable-diffusion/blob/main/TensorRT/Notebook.ipynb

harishprabhala commented 2 years ago

> @harishprabhala It seems that the conversion on your environment did work well. How about trying on google colab : https://github.com/stochasticai/x-stable-diffusion/blob/main/TensorRT/Notebook.ipynb

Conversion "did" work well or "didn't"? :) So, if the conversion went well, why are the predictions turning out this way? Re Colab, how will it be different? Colab has its own set of issues. The Git clone doesn't work properly and there's no terminal access. Any way we can fix it in a local environment? I am testing it on a 2080Ti GPU.

Toan-Do commented 2 years ago

We did not test on a 2080 Ti GPU. We tested on A100 and T4 GPUs. Google Colab provides a T4 GPU, so we suggest testing on Google Colab or on machines that have an A100 or T4 GPU.

harishprabhala commented 2 years ago

This is the output now on T4. This is becoming impossible 😅

[generated image attached]

Toan-Do commented 1 year ago

Hi @harishprabhala, sorry for the late response. We have just tested the TensorRT code again on Google Colab and it works without any issue. Our tested notebook is here: link. If you are still interested, please test it again and let us know if you hit any issue. Thank you.

ClementCJ commented 1 year ago

@Toan-Do The output on my local T4 looks like this: [generated image attached]

I also ran it on Google Colab, but it fails with the following error: `NotImplementedError: A UTF-8 locale is required. Got ANSI_X3.4-1968`
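That error is raised when `locale.getpreferredencoding()` reports a non-UTF-8 encoding in the runtime. A common workaround (a sketch, assuming the failing library checks the preferred encoding at import time) is to force a UTF-8 locale before running the offending cell:

```python
import locale
import os

# Force a UTF-8 locale so code that calls locale.getpreferredencoding()
# sees "UTF-8" instead of ANSI_X3.4-1968.
os.environ["LC_ALL"] = "C.UTF-8"
os.environ["LANG"] = "C.UTF-8"

# Some environments still report the old encoding for the current process;
# overriding the function itself is a widely used notebook-level workaround.
locale.getpreferredencoding = lambda do_setlocale=True: "UTF-8"
```

After this, restart the runtime (or re-run the failing cell) so the change takes effect before the library is imported.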

Toan-Do commented 1 year ago

@ClementCJ On Google Colab, after converting the model, you should restart the runtime and start running from this cell: https://colab.research.google.com/drive/1MAwk-DAngujQTaYh1eFZoOuocOhiKsge?usp=sharing#scrollTo=hLt_F8vpVGjl