How to Use Multi-GPU Training

Open · Dylandtt opened this issue 1 year ago
ERROR: Could not consume arg: --local-rank=1
ERROR: Could not consume arg: --local-rank=0
Hi @Dylandtt,
Please use WORLD_SIZE=8 python -m torch.distributed.run --nproc_per_node=8 train.py
for multi-GPU training (change 8 to the number of GPUs you want to use).
The --local-rank error appears because the older torch.distributed.launch passes --local-rank as a command-line argument, which train.py's argument parser does not accept; torch.distributed.run instead passes the rank through the LOCAL_RANK environment variable.
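In case it helps, here is a minimal sketch (not the repository's actual train.py) of how a training script can pick up the rank from the environment variables that torch.distributed.run sets, rather than expecting a --local-rank argument:

```python
import os

import torch
import torch.distributed as dist


def setup_distributed():
    # torch.distributed.run (torchrun) exports LOCAL_RANK, RANK and
    # WORLD_SIZE as environment variables; only the older
    # torch.distributed.launch appends --local-rank to the command line,
    # which is what produces the "Could not consume arg" error above.
    local_rank = int(os.environ.get("LOCAL_RANK", -1))
    if local_rank != -1:
        # Bind this process to its own GPU and join the process group.
        torch.cuda.set_device(local_rank)
        dist.init_process_group(backend="nccl")
    return local_rank


if __name__ == "__main__":
    local_rank = setup_distributed()
    world_size = os.environ.get("WORLD_SIZE", "1")
    print(f"process initialised on local rank {local_rank} of {world_size}")
```

Launched with the command above, each of the 8 processes reads its own LOCAL_RANK value and binds to a separate GPU.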
If you have any further questions or need additional information, please feel free to ask.