I have successfully trained my model on a single 16GB T4 GPU. But when I train with Kaggle's 2× T4 GPUs, the training process still runs and no errors are logged. The problem is that the training time with 2 GPUs does not decrease compared to 1 GPU; instead it increases, to as much as 50 minutes.
Can you tell me what is going wrong, and where? Thank you.