I specified `gpus_id = '0,1'`, but the model only runs on the first device. I checked the code and couldn't find a `DataParallel` call to distribute the model across multiple devices. Please let me know if I am missing something.
Hi @hfarhidzadeh, normally the model should train on multiple GPUs if you set the ID values correctly; the input data is divided across devices, with each GPU receiving roughly `batch_size // num_gpus` samples.
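For reference, a minimal sketch of the batch-splitting idea described above. The `split_batch` helper is hypothetical (not from this repo) and only illustrates how a scatter across GPUs might divide a batch; the commented `DataParallel` line shows the typical PyTorch wrapping, assuming PyTorch is used here.

```python
# Typical multi-GPU wrapping in PyTorch (illustrative, not this repo's code):
#   model = torch.nn.DataParallel(model, device_ids=[0, 1]).cuda()
# DataParallel then scatters each input batch across the listed GPUs.

def split_batch(batch_size: int, num_gpus: int) -> list:
    """Hypothetical helper: per-GPU chunk sizes for a scatter.
    Each GPU gets batch_size // num_gpus samples, with any remainder
    spread over the first GPUs."""
    base, rem = divmod(batch_size, num_gpus)
    return [base + (1 if i < rem else 0) for i in range(num_gpus)]

print(split_batch(32, 2))  # -> [16, 16]
print(split_batch(33, 2))  # -> [17, 16]
```

So with `gpus_id = '0,1'` and a batch size of 32, each GPU would see 16 samples per step, but only if the model is actually wrapped for multi-GPU use.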