Open AmitShashi opened 1 year ago
Time taken: 15.22s Torch active/reserved: 6179/7342 MiB, Sys VRAM: 8617/12020 MiB (71.69%)
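For reference, numbers like the "Torch active/reserved" and "Sys VRAM" figures in that log line can be read straight from torch. The snippet below is only a minimal sketch of how such stats are typically obtained, not the webui's exact code:

```python
# Minimal sketch: query the same kind of figures shown in the log line above.
import torch

dev = torch.device("cuda:0")

# Memory actually held by live tensors (loaded model, cached latents, ...).
active_mib = torch.cuda.memory_allocated(dev) / 2**20
# Memory torch's caching allocator has reserved from the driver; it is not
# returned to the OS until torch.cuda.empty_cache() is called.
reserved_mib = torch.cuda.memory_reserved(dev) / 2**20

# VRAM in use on the whole card, as the driver sees it (all processes).
free_b, total_b = torch.cuda.mem_get_info(dev)
used_mib = (total_b - free_b) / 2**20
total_mib = total_b / 2**20

print(f"Torch active/reserved: {active_mib:.0f}/{reserved_mib:.0f} MiB, "
      f"Sys VRAM: {used_mib:.0f}/{total_mib:.0f} MiB "
      f"({used_mib / total_mib * 100:.2f}%)")
```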
OK, so I guess I was not able to explain my problem clearly.
One time I installed a lot of extensions and thought my webui had become slow. To find the cause, I looked at Task Manager and saw a lot of VRAM in use. I thought maybe the extensions I had installed were using my VRAM, but I had no VRAM left for image generation. I removed all my extensions and half of the VRAM was freed. So was it the extensions or the loaded model that was using the VRAM? If I could find out which extension is using VRAM, I could remove it and then easily use the freed VRAM for image generation.
If an extension is using VRAM, I will remove that extension; if torch is using VRAM, I will unload the model. Knowing this would save me time.
Sorry for my bad English. I hope I was able to explain what I want. If you have any questions, please ask me.
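A rough way to separate "the loaded checkpoint" from "everything else torch has allocated" is to sum the sizes of the model's parameters and buffers that live on the GPU. The sketch below assumes `model` is whatever model object is currently loaded; the helper name is hypothetical, not an existing webui function:

```python
# Hypothetical helper: estimate how much VRAM the loaded checkpoint itself
# occupies, so it can be told apart from allocations made by extensions.
import torch

def model_vram_mib(model: torch.nn.Module) -> float:
    """Sum the byte sizes of all parameters and buffers that sit on the GPU."""
    total_bytes = 0
    for t in list(model.parameters()) + list(model.buffers()):
        if t.is_cuda:
            total_bytes += t.numel() * t.element_size()
    return total_bytes / 2**20

# Anything torch has allocated beyond this is activations, caches, or
# allocations made by extensions/scripts:
# other_mib = torch.cuda.memory_allocated() / 2**20 - model_vram_mib(model)
```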
Is there an existing issue for this?
What would your feature do?
I want a feature or extension to keep track of GPU CUDA VRAM usage. I can see how much VRAM is being used in Task Manager, but I cannot see which process inside the SD Web UI is using that VRAM.
Ideally, GPU CUDA VRAM should only be used while an image is being generated or a model is being trained, but the Stable Diffusion web UI keeps CUDA memory allocated even when nothing is being done.
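One way such a feature could attribute VRAM to individual steps, extension callbacks, or model loads is to measure the allocator's delta around each block of work. This is only a sketch of the idea, assuming the webui or an extension would wrap suspicious code regions with it; no such hook exists today:

```python
# Sketch: report how much CUDA memory a block of work kept and its peak usage.
import contextlib
import torch

@contextlib.contextmanager
def track_vram(label: str):
    torch.cuda.synchronize()
    torch.cuda.reset_peak_memory_stats()
    before = torch.cuda.memory_allocated()
    yield
    torch.cuda.synchronize()
    after = torch.cuda.memory_allocated()
    peak = torch.cuda.max_memory_allocated()
    print(f"[{label}] still held: {(after - before) / 2**20:+.1f} MiB, "
          f"peak while running: {peak / 2**20:.1f} MiB")

# Usage (hypothetical): wrap a suspicious piece of code, e.g. an extension
# callback or a model load:
# with track_vram("load checkpoint"):
#     load_model(...)
```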
Proposed workflow
One way to solve this problem is the "Unload SD checkpoint to free VRAM" button in the Actions section of the web UI settings. But I want a feature or extension that can tell me exactly which process, extension, library, API, web UI feature, or model is using the CUDA VRAM.
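For the unload side, the sequence below is roughly what freeing a model's VRAM involves in torch. It is a sketch under that assumption, not the webui's actual implementation, and `unload_model` / `model` are hypothetical names:

```python
# Sketch: release a model's VRAM back to the driver.
import gc
import torch

def unload_model(model: torch.nn.Module) -> None:
    model.to("cpu")           # move the weights out of VRAM first
    del model                 # drop this reference; callers must drop theirs too
    gc.collect()              # make sure the tensors are actually collected
    torch.cuda.empty_cache()  # hand the allocator's cached blocks back to the driver
    print(f"Torch now holds {torch.cuda.memory_allocated() / 2**20:.0f} MiB")
```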
Additional information
Doing so will help me make better decisions in the future, such as which extensions to keep and which model to use.