Open xiangxu007 opened 2 weeks ago
Hello, could you tell me how much GPU memory is needed for training and testing? I only have 6 GB of GPU memory, and I'm guessing that isn't enough. Best, Xu.

Hello Xu,
Thank you for your interest in Diffree.
For training with a batch size of 16, each GPU requires approximately 40 GB of memory. The GPU memory requirement is closely related to the batch size you choose. For inference, unfortunately, 6 GB of GPU memory is still insufficient. You would need around 16 GB of GPU memory to run inference smoothly.
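In case it is useful, below is a minimal sketch for checking whether a machine meets these thresholds. It assumes a PyTorch environment and hard-codes the ~16 GB (inference) and ~40 GB (training at batch size 16) estimates from above as rough thresholds; these are ballpark figures, not limits enforced anywhere in the code.

```python
import torch

# Rough estimates from the discussion above: ~16 GB for inference,
# ~40 GB per GPU for training at batch size 16. Ballpark figures only.
INFERENCE_GB = 16
TRAINING_GB = 40


def report_gpu_memory() -> None:
    """Print each CUDA device's total memory and compare it to the estimates."""
    if not torch.cuda.is_available():
        print("No CUDA device detected.")
        return
    for idx in range(torch.cuda.device_count()):
        props = torch.cuda.get_device_properties(idx)
        total_gb = props.total_memory / 1024**3
        print(f"GPU {idx}: {props.name}, {total_gb:.1f} GB total")
        print(f"  inference (~{INFERENCE_GB} GB): "
              f"{'likely OK' if total_gb >= INFERENCE_GB else 'likely too small'}")
        print(f"  training at batch size 16 (~{TRAINING_GB} GB): "
              f"{'likely OK' if total_gb >= TRAINING_GB else 'likely too small'}")


if __name__ == "__main__":
    report_gpu_memory()
```

If training memory is the bottleneck, lowering the batch size below 16 is the first thing to try, since the ~40 GB figure above corresponds to a batch size of 16.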
Please let me know if you have any other questions.
Best regards, Lirui