Hello, I'm using 8 RTX 2080 Ti GPUs and Swin Transformer-Tiny as the backbone to train the network. It takes me almost 60 hours to complete 40 training epochs. Is this normal?
If it is normal, what is the main reason for the slow training speed? I tried removing the PWAM, but training is still slow. 60 hours seems like a long time for a single training run.
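For reference, here is the per-epoch and per-iteration arithmetic behind that 60-hour figure. The iterations-per-epoch count is a placeholder assumption, since the dataset and batch size aren't stated above:

```python
# Break the reported 60-hour / 40-epoch run into per-epoch and per-iteration time.
total_hours = 60
epochs = 40

hours_per_epoch = total_hours / epochs       # 60 / 40 = 1.5 hours per epoch
seconds_per_epoch = hours_per_epoch * 3600   # 5400 seconds per epoch

# Hypothetical: assume ~1000 iterations per epoch (depends on dataset and batch size).
iters_per_epoch = 1000
seconds_per_iter = seconds_per_epoch / iters_per_epoch

print(f"{hours_per_epoch} h/epoch, {seconds_per_iter:.1f} s/iteration")
# → 1.5 h/epoch, 5.4 s/iteration
```

If the measured time per iteration is much higher than the forward/backward pass alone, the bottleneck is likely data loading or inter-GPU communication rather than the model itself.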
Looking forward to your reply.