Closed mooncakehub closed 7 months ago
You mentioned the RTX 3060 and RTX 3090 in the article. Are they used at the same time? How much CUDA memory is needed?
No. We use the RTX 3060 to train the model at 64x512, and the RTX 3090 to train the models at 64x1024 and 64x2048. For both 64x512 and 64x2048, the batch size is chosen to make full use of all GPU memory.
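As a rough illustration of "batch size makes full use of all GPU memory": the largest batch that fits scales with the GPU's memory and inversely with the per-sample footprint, which itself grows roughly linearly with the crop area. The sketch below uses hypothetical per-sample numbers (not from this repo) purely to show the relationship; `max_batch_size` is an illustrative helper, not part of the codebase.

```python
def max_batch_size(gpu_mem_gb: float, per_sample_mb: float) -> int:
    """Largest batch that fits in the given GPU memory budget (rough estimate)."""
    return int(gpu_mem_gb * 1024 // per_sample_mb)

# Hypothetical per-sample footprints; doubling the crop width roughly
# doubles the activation memory per sample.
configs = [
    ("RTX 3060", 12, "64x512", 300),    # 12 GB card, smallest crop
    ("RTX 3090", 24, "64x1024", 600),   # 24 GB card, medium crop
    ("RTX 3090", 24, "64x2048", 1200),  # 24 GB card, largest crop
]

for gpu, mem_gb, crop, per_sample_mb in configs:
    b = max_batch_size(mem_gb, per_sample_mb)
    print(f"{gpu} ({mem_gb} GB), crop {crop}: batch size up to ~{b}")
```

In practice you would tune the batch size empirically (e.g. raise it until you hit an out-of-memory error), since the true per-sample footprint depends on the model, optimizer state, and mixed-precision settings.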