THUNLP-MT / THUMT

An open-source neural machine translation toolkit developed by Tsinghua Natural Language Processing Group
BSD 3-Clause "New" or "Revised" License

Does the transformer model support multi-GPU training? #29

Closed XiaoqingNLP closed 6 years ago

XiaoqingNLP commented 6 years ago

I tried to reproduce the experiment with the transformer model and multiple GPUs, but I found that many lines in the decoded *trans.norm file are empty. The rnnsearch model trained on a single GPU does not show this problem. Branch: tensorflow/master

Playinf commented 6 years ago

I think you probably changed the default parameters of the transformer architecture, which is quite sensitive to hyperparameters. All models in THUMT support the multi-GPU setting, and with the default parameters the transformer architecture generally performs much better than rnnsearch.
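As a rough sketch of the advice above, a multi-GPU training run that keeps the transformer's default hyperparameters might look like the following. The exact flags depend on the THUMT version; the script path, data filenames, and the `device_list` override shown here are assumptions, not taken from this thread:

```shell
# Hypothetical THUMT invocation (verify flags against your checkout).
# corpus.*, vocab.*, and dev.* are placeholder data files.
# device_list selects the GPUs; leaving all other hyperparameters
# at their defaults avoids the sensitivity issue described above.
python thumt/bin/trainer.py \
  --input corpus.src corpus.tgt \
  --vocabulary vocab.src vocab.tgt \
  --model transformer \
  --validation dev.src \
  --references dev.tgt \
  --parameters=device_list=[0,1,2,3]
```

If decoding still produces empty lines, comparing your full parameter override string against the shipped transformer defaults is a reasonable first check.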