Open hee-dongdong opened 4 days ago
I have six 4090 GPUs (120 GB of VRAM total). However, when I try to fine-tune the model, I get a "CUDA out of memory" error. How much VRAM is needed to train the ViT backbone model? I would also like to know how many GPUs you used when you pretrained the model.
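For context, here is the rough back-of-envelope estimate I used for the training-state memory of full fine-tuning. This is only a sketch: it assumes fp32 weights with a standard Adam optimizer, and the ~307M parameter count is my assumption for a ViT-L/16 backbone (adjust for the actual model). Activation memory, which usually dominates at large batch sizes, is not included.

```python
def training_state_bytes(n_params: int,
                         bytes_weights: int = 4,    # fp32 weights
                         bytes_grads: int = 4,      # fp32 gradients
                         bytes_optim: int = 8) -> int:  # Adam m and v moments
    """Lower bound on per-GPU training state for full fine-tuning
    (weights + gradients + optimizer state, no activations)."""
    return n_params * (bytes_weights + bytes_grads + bytes_optim)

# Assumed parameter count for ViT-L/16; check your actual checkpoint.
vit_l_params = 307_000_000
gib = training_state_bytes(vit_l_params) / 2**30
print(f"~{gib:.1f} GiB of training state before activations")
```

Even this lower bound (~4.6 GiB) fits on one 24 GB card, which suggests the OOM comes from activation memory, so I am wondering whether gradient checkpointing or a smaller batch size was needed during your pretraining.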