NVIDIA / Stable-Diffusion-WebUI-TensorRT

TensorRT Extension for Stable Diffusion Web UI

[Feature Request] Add a Command Line Interface for the exporter process. #320

Open fcolecumberri opened 2 months ago

fcolecumberri commented 2 months ago

At this point, looking at the README and the Issues of this project, it's quite obvious that a lot of people (myself included) are having trouble exporting the models. The reasons seem to vary. Sometimes it's a flag on the webui, as the README describes:

https://github.com/NVIDIA/Stable-Diffusion-WebUI-TensorRT/blob/b75f5604a7e4534899c720ef8db5e15042b2b9f9/README.md?plain=1#L52

Sometimes this only works on certain forks of the webui: https://github.com/NVIDIA/Stable-Diffusion-WebUI-TensorRT/issues/262#issuecomment-2037059915

I am also willing to bet that sometimes it fails because some other extension interferes with the process.

Sometimes the process also crashes with a CUDA out-of-memory error, as seen in https://github.com/NVIDIA/Stable-Diffusion-WebUI-TensorRT/issues/175

I think it's quite clear that if the exporting process had a Command Line Interface that could be invoked on its own, so that only the export is running and nothing else interferes with it, then (hopefully) most of these problems would disappear.

It would also help with bug reports: people could describe exactly what happens when the CLI fails, which would involve only the TensorRT process and (hopefully) reduce the complexity of tracking down bugs.
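
To make the idea concrete, here is a minimal sketch of what a standalone build step could look like, written against the stock TensorRT Python API rather than the extension's code. It only covers the ONNX → engine half of the export, it assumes the ONNX UNet has already been written to disk, and the input names and shape ranges are guesses based on a typical SD 1.5 UNet, so they would have to match whatever the extension actually exports. A real CLI would of course reuse the extension's own export and profile/metadata code so the resulting engines are picked up by the webui.

```python
#!/usr/bin/env python3
"""Hypothetical standalone engine-build step (sketch only).

Covers just the ONNX -> .trt half of the export. The input names and
shape ranges below are assumptions for a typical SD 1.5 UNet and must
be adjusted to match the ONNX file the extension actually writes.
"""
import argparse

import tensorrt as trt


def build_engine(onnx_path: str, engine_path: str, fp16: bool = True) -> None:
    logger = trt.Logger(trt.Logger.INFO)
    builder = trt.Builder(logger)
    network = builder.create_network(
        1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH)
    )
    parser = trt.OnnxParser(network, logger)

    # Parse the ONNX file exported beforehand (e.g. by the webui extension).
    with open(onnx_path, "rb") as f:
        if not parser.parse(f.read()):
            for i in range(parser.num_errors):
                print(parser.get_error(i))
            raise RuntimeError(f"Failed to parse {onnx_path}")

    config = builder.create_builder_config()
    if fp16:
        config.set_flag(trt.BuilderFlag.FP16)

    # Assumed UNet input names and dynamic-shape ranges (SD 1.5, ~512x512,
    # batch 1-4). These are illustrative guesses, not the extension's values.
    profile = builder.create_optimization_profile()
    profile.set_shape("sample", (1, 4, 64, 64), (2, 4, 64, 64), (4, 4, 96, 96))
    profile.set_shape("timesteps", (1,), (2,), (4,))
    profile.set_shape("encoder_hidden_states",
                      (1, 77, 768), (2, 77, 768), (4, 154, 768))
    config.add_optimization_profile(profile)

    serialized = builder.build_serialized_network(network, config)
    if serialized is None:
        raise RuntimeError("Engine build failed")
    with open(engine_path, "wb") as f:
        f.write(serialized)


if __name__ == "__main__":
    ap = argparse.ArgumentParser(
        description="Build a TensorRT engine from an already-exported ONNX UNet"
    )
    ap.add_argument("onnx", help="path to the exported ONNX file")
    ap.add_argument("engine", help="output path for the .trt engine")
    ap.add_argument("--no-fp16", action="store_true",
                    help="build in FP32 instead of FP16")
    args = ap.parse_args()
    build_engine(args.onnx, args.engine, fp16=not args.no_fp16)
```

Running something like this in its own process would also sidestep the VRAM the webui is already holding, which is exactly the out-of-memory scenario from #175.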