THUNLP-MT / THUMT

An open-source neural machine translation toolkit developed by Tsinghua Natural Language Processing Group
BSD 3-Clause "New" or "Revised" License

Has the latest version added a multi-GPU training module? #50

Closed alphadl closed 5 years ago

alphadl commented 5 years ago

That's an important and efficient trick to speed up training. I saw your team's paper, which said multi-GPU support would be implemented soon, but I can't find any description of it in your README.

Glaceon31 commented 5 years ago

Yes, the device_list parameter enables multi-GPU training:

device_list: the list of GPUs to be used in training. Use the nvidia-smi command to find unused GPUs. If the unused GPUs are gpu0 and gpu1, set this parameter as device_list=[0,1].
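For illustration, a training invocation using two GPUs might look like the sketch below. This is an assumption based on the description above, not an exact command from the manual: the script path, corpus/vocabulary file names, and model name are placeholders, and the exact `--parameters` syntax should be checked against UserManual.pdf.

```shell
# Hypothetical sketch: train on GPUs 0 and 1 via the device_list parameter.
# File names (corpus.*, vocab.*) and the model name are placeholders.
python thumt/bin/trainer.py \
  --input corpus.src corpus.tgt \
  --vocabulary vocab.src.txt vocab.tgt.txt \
  --model transformer \
  --parameters=device_list=[0,1]
```

Each GPU listed in device_list processes a slice of every batch, so the effective batch size grows with the number of devices.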

Please refer to UserManual.pdf for detailed usage.

alphadl commented 5 years ago

Okay, many thanks.