AUTOMATIC1111 / stable-diffusion-webui-tensorrt


How to clear the PyTorch base model's GPU memory? TensorRT GPU memory usage is larger than PyTorch. #45

Open WudiJoey opened 1 year ago

WudiJoey commented 1 year ago

What happened?

When I launch the webui it uses about 3 GB of GPU memory right at startup, even if I do nothing. When I generate with the base model plus a LoRA, usage rises to about 5 GB and drops back to 3 GB after generation. But when I switch SD Unet to the TensorRT model under [Stable Diffusion] in Settings, peak memory reaches 6 GB. I guess the 3 GB is the PyTorch memory of the base model, so if I only need TRT inference, how do I clear it from my GPU?

Steps to reproduce the problem

1. Run `watch -n 1 -d nvidia-smi` to watch how GPU memory changes.
2. Go to Settings, open [Stable Diffusion], change SD Unet to the [TRT] model, and press [Apply settings].
3. Go to txt2img and generate.

What should have happened?

When using TRT mode, all PyTorch GPU memory for the base model should be cleared.
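The extension does not expose such a cleanup step today, but a minimal sketch of the kind of release being asked for might look like the snippet below. The `sd_model` argument and the `model.diffusion_model` attribute path are assumptions about how the webui holds the loaded checkpoint; adjust them to the actual objects.

```python
import gc
import torch

def free_pytorch_unet(sd_model):
    """Best-effort release of the PyTorch UNet weights once a TRT engine
    handles inference. The attribute path below is an assumption, not
    an API the extension provides."""
    unet = getattr(sd_model.model, "diffusion_model", None)
    if unet is not None:
        # Move the weights off the GPU so their CUDA memory can be reclaimed.
        unet.to("cpu")
    # Release cached allocations held by PyTorch's caching allocator.
    gc.collect()
    torch.cuda.empty_cache()
```

Note that `torch.cuda.empty_cache()` only frees memory PyTorch is no longer using, so the UNet weights have to be moved or deleted first; even then, the CUDA context itself keeps some memory resident, so usage will not drop to zero.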

yoinked-h commented 1 year ago

it's 4 GB minimum for generating