Open yejr0229 opened 2 weeks ago
Great question. I used 4 A100-40G GPUs with a batch size of 3. I haven't tried training on a single GPU, but I believe the current implementation supports small batch sizes such as 1 or 2.
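For anyone estimating whether a smaller card will fit, here is a rough back-of-envelope sketch based only on the numbers above (4 A100-40G GPUs, per-GPU batch size 3). The linear scaling is a crude assumption, not a measured memory profile; actual usage depends on model weights, activations, and optimizer state.

```python
# Crude linear estimate of per-sample GPU memory from the reported
# setup (4 x A100-40G, per-GPU batch size 3). This treats the full
# 40 GB as consumed by the batch, so it is an upper bound per sample.
a100_mem_gb = 40
batch_per_a100 = 3
mem_per_sample_gb = a100_mem_gb / batch_per_a100  # ~13.3 GB upper bound

# How many samples might fit on a 24 GB RTX 3090 under this model?
rtx3090_mem_gb = 24
max_batch_3090 = int(rtx3090_mem_gb // mem_per_sample_gb)
print(max_batch_3090)  # -> 1
```

Under this (pessimistic) model a 24 GB card would fit batch size 1, consistent with the suggestion above that batch sizes of 1 or 2 may work; real headroom could be better since fixed costs (weights, optimizer state) do not scale with batch size.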
yejr - did you get it to work on an RTX 3090?
Hi, I'd like to know how much memory is required for LaRa. Is an A100 needed, or will an RTX 3090 with 24 GB be enough?