AUTOMATIC1111 / stable-diffusion-webui-tensorrt


Add Batch mode + Limit calculation #36

Open wizz13150 opened 1 year ago

wizz13150 commented 1 year ago

Hey,

Repost of the previous messy PRs.

A batch mode would be appreciated.

Obviously not a Python expert, lol.

Convert everything in 5-8 clicks:

This way, using the default folders, it converts every .ckpt and .safetensors model in the 'Stable-Diffusion' folder to .onnx in the 'Unet-onnx' folder, then automatically starts converting each one to .trt in the 'Unet-trt' folder, like a queue. Et voilà! The 'Unet-onnx' and 'Unet-trt' folders end up populated with engines for all the available models from the 'Stable-Diffusion' folder.
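Roughly, the batch mode is just a loop over the checkpoint folder, as in the sketch below. This is not the exact PR code: the folder paths and the two helper functions (`export_onnx`, `build_trt`) are placeholders standing in for the extension's existing single-model "Convert to ONNX" and "Convert ONNX to TensorRT" steps.

```python
# Sketch of the batch loop, not the actual PR code.
from pathlib import Path

MODELS_DIR = Path("models/Stable-diffusion")   # default checkpoint folder
ONNX_DIR = Path("models/Unet-onnx")            # default ONNX output folder
TRT_DIR = Path("models/Unet-trt")              # default TensorRT output folder

def export_onnx(ckpt: Path, onnx_out: Path) -> None:
    """Placeholder: call the extension's checkpoint -> .onnx export here."""
    print(f"[onnx] {ckpt.name} -> {onnx_out.name}")

def build_trt(onnx_in: Path, trt_out: Path) -> None:
    """Placeholder: call the extension's .onnx -> .trt engine build here."""
    print(f"[trt]  {onnx_in.name} -> {trt_out.name}")

def batch_convert() -> None:
    ONNX_DIR.mkdir(parents=True, exist_ok=True)
    TRT_DIR.mkdir(parents=True, exist_ok=True)

    # Queue every checkpoint found in the Stable-Diffusion folder.
    checkpoints = sorted(
        p for p in MODELS_DIR.iterdir()
        if p.suffix in (".ckpt", ".safetensors")
    )

    for ckpt in checkpoints:
        onnx_path = ONNX_DIR / (ckpt.stem + ".onnx")
        trt_path = TRT_DIR / (ckpt.stem + ".trt")

        # Skip work that is already done, so the batch can be re-run safely.
        if not onnx_path.exists():
            export_onnx(ckpt, onnx_path)
        if not trt_path.exists():
            build_trt(onnx_path, trt_path)

if __name__ == "__main__":
    batch_convert()
```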

Screen for the 'Convert to ONNX' tab:

[screenshots]

Screen for the 'Convert ONNX to TensorRT' tab:

[screenshots]

I mean, it worked for me, lol.

Cheers! 🥂

wizz13150 commented 1 year ago

The only thing still to do (as far as I can tell):

[screenshot]

Not sure yet how to tackle this one, though. 🤨🤔 I won't touch it; I just set the 'SD Unet' setting to 'None' to unlink the engine during conversion.

[screenshot]
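If someone wants to do the same thing from code instead of the settings UI, a minimal sketch could look like this, assuming the WebUI exposes that dropdown as `shared.opts.sd_unet` (the key name is my assumption and may differ between WebUI versions):

```python
# Minimal sketch: temporarily force 'SD Unet' to 'None' while converting,
# then restore whatever was selected before. The option key "sd_unet" is
# assumed to match the "SD Unet" dropdown in Settings.
from modules import shared

def with_unet_unlinked(convert_fn):
    previous = shared.opts.sd_unet
    shared.opts.sd_unet = "None"
    try:
        convert_fn()   # run the ONNX / TRT conversion with no engine linked
    finally:
        shared.opts.sd_unet = previous
```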

I can see people here and there converting to TRT at 640x640 with batch size 2, which I don't understand, since that exceeds the limit for me. If anyone has an answer for this, I'm interested. Example, on an RTX 4070 Ti with 12 GB VRAM, result = 32.3 it/s: https://www.youtube.com/watch?v=bT7SaMkgNEY&t=505s
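For context on why that surprises me: if the limit scales with batch size times latent area (that scaling rule is my assumption, not a confirmed formula from the extension), 640x640 at batch 2 is noticeably heavier than 512x512 at batch 2:

```python
# Back-of-the-envelope comparison, assuming the limit scales with
# batch * latent area, where the latent is 1/8 of the image resolution
# in each dimension.
def latent_workload(width: int, height: int, batch: int) -> int:
    return batch * (width // 8) * (height // 8)

print(latent_workload(512, 512, 2))   # 8192
print(latent_workload(640, 640, 2))   # 12800 -> about 1.56x the 512x512 case
```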