Open Amuseum-WHR opened 9 months ago
How long does the entire inference take? I found Stage I very slow on a 3090, and adding GPUs does not speed up the process.
The whole process takes about 2.5 hours on a single 40 GB A100. Alternatively, you may try the faster version here, which reduces the processing time to around 40 minutes.