levihsu / OOTDiffusion

Official implementation of OOTDiffusion: Outfitting Fusion based Latent Diffusion for Controllable Virtual Try-on

Inference run time and resources #159

Closed dimeetrius closed 5 months ago

dimeetrius commented 5 months ago

Hi, could you please advise what the typical inference run time and GPU memory requirements are for the upper-, lower-, and full-body modes? I am running on an A100 80GB in Google Colab, and it takes over 4 minutes to complete for the lower body with the parameters below. Is this expected performance? I am trying to understand where the issue is.

`--model_type dc --category 1 --scale 2.0 --sample 1`

For the full-body mode, 80GB of GPU memory is not enough to execute inference. How much memory does this mode require?

Thank you
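For reference, a quick way to get a trustworthy wall-clock number for a single run is to wrap the inference call in a small timing harness like the sketch below. `run_inference` here is a hypothetical stand-in for the actual pipeline call (e.g. invoking the repo's script with the flags above), not part of the OOTDiffusion API.

```python
import time
from contextlib import contextmanager

@contextmanager
def timed(label):
    # Record wall-clock time around a block and print the elapsed seconds.
    start = time.perf_counter()
    yield
    elapsed = time.perf_counter() - start
    print(f"{label}: {elapsed:.1f}s")

def run_inference():
    # Hypothetical stand-in for the real try-on pipeline call
    # (e.g. running with --model_type dc --category 1 --scale 2.0 --sample 1).
    time.sleep(0.1)

with timed("lower-body inference"):
    run_inference()
```

Timing a second run separately from the first also helps distinguish one-time model loading cost from the per-sample denoising time.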

levihsu commented 5 months ago

Hi. Both the half-body and full-body models take less than 7GB of GPU memory for single-sample inference. Please try https://huggingface.co/spaces/levihsu/OOTDiffusion to see the inference time on an A100 40GB.