I used two 3090 Ti GPUs to train the MP-SENet model, each with 24 GB of memory.
@yxlu-0102 Is it the same for inference?
One GPU with 24 GB of memory is enough for inference.
@yxlu-0102 I just tried running inference on my 3090 Ti (24 GB VRAM), and I am still getting an OOM error. Is there a way to reduce the memory usage?
Because utterance lengths vary, the memory consumed when enhancing each utterance is not fixed. Hitting an OOM does not depend on whether the GPU is a 3090 or not; it depends primarily on how much GPU memory you have relative to the length of the input speech.
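One common workaround for long inputs (not something this repo provides out of the box) is to split the utterance into fixed-length segments, enhance each segment on its own, and concatenate the results, which bounds peak memory regardless of total length. A minimal sketch, assuming a hypothetical `model` callable that maps a `(1, T)` noisy waveform tensor to an enhanced tensor of the same shape; adapt it to the actual MP-SENet inference code:

```python
import torch

def enhance_in_chunks(model, noisy, chunk_len=10 * 16000, device="cuda"):
    """Enhance a long waveform in fixed-size chunks to bound peak GPU memory.

    `model` is a placeholder for the enhancement model (hypothetical
    interface: (1, T) noisy waveform in, (1, T) enhanced waveform out).
    """
    outputs = []
    with torch.inference_mode():  # no autograd buffers during inference
        for start in range(0, noisy.shape[-1], chunk_len):
            chunk = noisy[..., start:start + chunk_len].to(device)
            outputs.append(model(chunk).cpu())  # move result off the GPU promptly
            torch.cuda.empty_cache()            # release cached blocks between chunks
    return torch.cat(outputs, dim=-1)
```

Note that hard chunk boundaries can introduce audible seams; overlapping the chunks and crossfading the overlaps (overlap-add) is a common refinement.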
During training, you can reduce GPU memory usage by lowering the batch size and segment length, but this may affect the training results.
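For illustration, a small script that lowers those two fields in the training config; the key names below (`batch_size`, `segment_size`) follow the HiFi-GAN-style `config.json` used by many speech-enhancement repos and are an assumption here, so check the actual keys in this repo before editing:

```python
import json

# Hypothetical example: shrink the two memory-heavy training settings.
with open("config.json") as f:
    cfg = json.load(f)

cfg["batch_size"] = 2          # fewer utterances per step -> less activation memory
cfg["segment_size"] = 16000    # shorter training crops -> smaller spectrograms

with open("config.json", "w") as f:
    json.dump(cfg, f, indent=4)
```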
@yxlu-0102 I would like to check what the minimum VRAM needed to run this model is. I have a 2070 Super and am running into an OOM error.
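Since peak usage grows with utterance length, there is no single published minimum; one way to estimate the requirement for your own clips is to measure peak allocation directly. A sketch, where `model` and `noisy` are placeholders for the actual model and a noisy waveform tensor:

```python
import torch

def measure_peak_memory(model, noisy):
    """Report peak GPU memory for one enhancement pass.

    `model` and `noisy` stand in for the real MP-SENet model and a
    (1, T) noisy waveform tensor (hypothetical interface).
    """
    torch.cuda.reset_peak_memory_stats()
    with torch.inference_mode():
        model(noisy.to("cuda"))
    peak_gib = torch.cuda.max_memory_allocated() / 1024**3
    print(f"peak GPU memory: {peak_gib:.2f} GiB")  # compare against your card's VRAM
```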