Open azmat21 opened 5 years ago
When the max vocab size is very big (e.g., `--max_vocab=1000000` or `1500000`), a single GPU throws a CUDA out-of-memory error. How can I run with multiple GPUs?
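For context, a minimal sketch of the kind of multi-GPU setup being asked about, assuming the training code is PyTorch (an assumption; the repo and trainer aren't named here). Note that `nn.DataParallel` replicates the whole module, including the large embedding, on every GPU, so it splits the batch but does not by itself reduce per-GPU memory for a huge vocab; that would need sharding the embedding (e.g., model parallelism or `DistributedDataParallel` with a partitioned table).

```python
# Hypothetical sketch, not the project's actual code: a vocabulary-sized
# embedding like the one --max_vocab controls, run data-parallel when
# multiple GPUs are present, with a CPU fallback so the snippet runs anywhere.
import torch
import torch.nn as nn

vocab_size = 1000  # kept small here; the issue uses 1_000_000+
emb_dim = 16

model = nn.Embedding(vocab_size, emb_dim)

if torch.cuda.device_count() > 1:
    # Replicates the module on each GPU and splits the batch dimension.
    # The full embedding table still lives on every GPU.
    model = nn.DataParallel(model).cuda()
elif torch.cuda.is_available():
    model = model.cuda()

batch = torch.randint(0, vocab_size, (8,))
if torch.cuda.is_available():
    batch = batch.cuda()

out = model(batch)
print(tuple(out.shape))  # (8, 16)
```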