Hi,
Could you please advise what the typical inference run time and GPU memory requirements are for the upper/lower/full-body modes?
I am running on an A100 80GB in Google Colab, and inference takes over 4 minutes to complete for the lower body (parameters below). Is this expected performance? I'm trying to understand where the issue is.
--model_type dc --category 1 --scale 2.0 --sample 1
For the full-body mode, 80GB of GPU memory is not enough to execute inference. How much memory does this mode require?
Hi. Both the half-body and full-body models take less than 7GB of GPU memory for single-sample inference. Please try https://huggingface.co/spaces/levihsu/OOTDiffusion to see the inference time on an A100 40GB.
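If you want to verify where the time and memory actually go on your Colab instance, one option is to wrap the inference call in a small profiler. This is a minimal sketch under stated assumptions: `profile_inference` is a hypothetical helper (not part of the OOTDiffusion repo), and the lambda stands in for the real inference call; the torch import is optional so the timing part also works without a GPU.

```python
# Minimal sketch for profiling one inference call (hypothetical helper,
# not part of OOTDiffusion). Replace the stand-in workload with the
# actual inference call you run in Colab.
import time

try:
    import torch  # optional: only needed for GPU memory stats
except ImportError:
    torch = None


def profile_inference(fn, *args, **kwargs):
    """Run fn once; return (result, wall seconds, peak GPU bytes or None)."""
    peak = None
    use_cuda = torch is not None and torch.cuda.is_available()
    if use_cuda:
        torch.cuda.reset_peak_memory_stats()
        torch.cuda.synchronize()  # don't count queued prior work
    start = time.perf_counter()
    result = fn(*args, **kwargs)
    if use_cuda:
        torch.cuda.synchronize()  # wait for GPU kernels to finish
        peak = torch.cuda.max_memory_allocated()
    elapsed = time.perf_counter() - start
    return result, elapsed, peak


if __name__ == "__main__":
    # Stand-in workload; swap in your OOTDiffusion inference call here.
    out, secs, peak = profile_inference(lambda: sum(range(10**6)))
    print(f"elapsed: {secs:.3f}s, peak GPU memory: {peak}")
```

If the peak memory stays well under 7GB but the wall time is still minutes, the bottleneck is likely not the model itself (e.g. model loading, preprocessing, or Colab I/O), which would help narrow down the 4-minute runs.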
Thank you