Closed: rafikg closed this issue 2 years ago.
The config files are under script/ now (my mistake). To use a single GPU (e.g. physical GPU 1), run CUDA_VISIBLE_DEVICES=1 python train.py -config xxx.yml and set gpu_ranks to 0. Similarly, if you'd like to use 2 GPUs, set world_size=2 and gpu_ranks=0,1, then run CUDA_VISIBLE_DEVICES=1,2 python train.py -config xxx.yml. Hope it helps.
Rui
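As a sketch, the two cases above would look roughly like this in an OpenNMT-py-style YAML config (the field names world_size and gpu_ranks are from OpenNMT-py; the exact file layout in this repo may differ):

```yaml
# Single GPU: expose physical GPU 1, which becomes logical device 0.
# Launch with: CUDA_VISIBLE_DEVICES=1 python train.py -config xxx.yml
world_size: 1
gpu_ranks: [0]

# Two GPUs: expose physical GPUs 1 and 2; they become logical devices 0 and 1.
# Launch with: CUDA_VISIBLE_DEVICES=1,2 python train.py -config xxx.yml
# world_size: 2
# gpu_ranks: [0, 1]
```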
Thanks @memray. I have already fixed it.
Hi @memray,
1) I just started with your code. I found that config/transfer_kp/train does not exist! I created it and copied transformer-presabs-kp20k.yml from here. Is it the same file? I then ran python train.py -config config/transfer_kp/train/transformer-presabs-kp20k.yml.
2) I am running the code on a multi-GPU machine and want to use only one GPU (e.g. GPU number 1). I specified gpu_ranks: 1, but it does not work. I checked, and it seems world_size must be greater than each value in gpu_ranks (e.g. world_size=2). Is gpu rank = gpu id, and is world_size = the number of available GPUs on the machine?
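On the last two questions: gpu_ranks indexes the GPUs *visible to the process*, not the physical GPU ids, and world_size is the total number of GPUs/processes used for training. A minimal sketch of the renumbering that CUDA_VISIBLE_DEVICES applies (the helper visible_device_map is hypothetical, for illustration only):

```python
def visible_device_map(cuda_visible_devices: str) -> dict:
    """Map logical device index (what gpu_ranks refers to) -> physical GPU id.

    Illustrates how CUDA renumbers devices when CUDA_VISIBLE_DEVICES is set:
    the listed physical GPUs become logical devices 0, 1, ... in order.
    """
    physical = [int(x) for x in cuda_visible_devices.split(",") if x.strip()]
    return dict(enumerate(physical))

# With CUDA_VISIBLE_DEVICES=1, physical GPU 1 is logical device 0,
# so the config needs gpu_ranks: 0, not 1.
print(visible_device_map("1"))    # → {0: 1}
print(visible_device_map("1,2"))  # → {0: 1, 1: 2}
```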