Open ddddqt opened 10 months ago
Hello author, I am trying to use two GPUs for distributed training, but the process hangs at this point and never starts training. Have you ever encountered this situation?
I have not encountered this issue. If you're training on JHMDB, one GPU is enough; training should not take long.