zhiyuanyou closed this issue 2 years ago.
About torch.distributed.launch: I launch training with

sh train_torch.sh #NUM_GPUS

It works well with 1 GPU. However, with multiple GPUs it fails with

RuntimeError: Address already in use
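For context, this error usually means the rendezvous port that torch.distributed.launch binds (29500 by default) is already taken, for example by another distributed job on the same machine. Below is a minimal sketch of how a launch script could pass a free --master_port; the script contents, the port number, and the train.py entry point are assumptions for illustration, not the repository's actual train_torch.sh.

#!/bin/bash
# Hypothetical train_torch.sh; contents are assumed, not the repo's actual script.
NUM_GPUS=$1
# Use a non-default master port so concurrent jobs do not collide on 29500.
MASTER_PORT=${2:-29501}
python -m torch.distributed.launch \
    --nproc_per_node=$NUM_GPUS \
    --master_port=$MASTER_PORT \
    train.py   # placeholder for the actual training entry point

With a script like this, sh train_torch.sh 4 29502 would run on 4 GPUs while binding an explicitly chosen port.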