jebarpg opened 1 year ago
Wait for a while, then it's OK.
Same issue here. I also can't change the default maximum width, height, batch size, or prompt token count without it spitting out tons of errors.
I made a fix for all these issues. You can check out my fork with the changes here: https://github.com/jebarpg/stable-diffusion-webui-tensorrt

I did all the manual testing and discovered the limits of the shapes you can create with max width, height, and batch size. The best combination I have found is batch size 7 with max width 512 and max height 512. You can max out the max token count; it has no effect on the shape size limit. Only the max batch size, max width, and max height have any effect. I also discovered a base number for every batch size from 1 to 11, which lets you know how far you can slide the max width and height: you will get a red label signaling that you are over the limit and a green one otherwise. I've created a pull request, so hopefully it gets integrated.

The fork also lets you do batch processing instead of just one model at a time, for both the ONNX files and the TRT files. NOTE that your settings for max width, height, batch size, tokens, etc. will apply to the entire batch. Let me know what you all think.
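To make the trade-off described above concrete, here is a minimal sketch of a shape-budget check. This is not the extension's actual code: the budget constant and the function name are assumptions inferred purely from the numbers reported in this comment (batch size 7 at 512x512 being the working maximum), and the real rule in the fork may differ.

```python
# Hypothetical shape-budget check, inferred from the reported limits
# (batch size 7 at 512x512 working). The real extension may use a
# different formula; this only illustrates the width/height/batch trade-off.
MAX_SHAPE_BUDGET = 7 * 512 * 512  # assumed budget from the numbers above


def within_shape_limit(batch_size: int, max_width: int, max_height: int) -> bool:
    """Return True (green label) if the requested shapes fit the budget,
    False (red label) otherwise."""
    return batch_size * max_width * max_height <= MAX_SHAPE_BUDGET


print(within_shape_limit(7, 512, 512))  # fits the assumed budget
print(within_shape_limit(8, 512, 512))  # exceeds it
```

Under this assumed rule, raising the batch size forces the max width and height down, which matches the "slide the max width and height" behavior described above.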
I'm using torch 2.0.1+cu118 and Python 3.10.10. When I press "Convert Unet to ONNX", I get the following output: