THUNLP-MT / THUMT

An open-source neural machine translation toolkit developed by Tsinghua Natural Language Processing Group
BSD 3-Clause "New" or "Revised" License

Use CPU for inference #95

Open qpzhao opened 4 years ago

qpzhao commented 4 years ago

Hi, how can I set the parameters or modify `translator.py` to run inference on the CPU?

Playinf commented 4 years ago

Unfortunately, the PyTorch implementation does not currently support CPU inference.

qpzhao commented 4 years ago

> Unfortunately, the PyTorch implementation does not currently support CPU inference.

Thanks. But I use the TensorFlow implementation. Does the TensorFlow implementation support CPU inference?

GrittyChen commented 4 years ago

@qpzhao Yes, the TensorFlow implementation supports inference with CPU.
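
Independent of any THUMT-specific flag, a common way to force TensorFlow onto the CPU is to hide all GPUs from the process before launching the translator. This relies on standard `CUDA_VISIBLE_DEVICES` behavior, not on a THUMT option; the translator invocation below is an illustrative placeholder:

```bash
# Hide all GPUs so TensorFlow falls back to the CPU
# (standard CUDA_VISIBLE_DEVICES behavior, not a THUMT flag).
export CUDA_VISIBLE_DEVICES=""
python translator.py ...   # remaining arguments as in your usual setup
```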

GrittyChen commented 3 years ago

> Hi, how can I set the parameters or modify `translator.py` to run inference on the CPU?

The PyTorch implementation now supports CPU inference. You can pass the `--cpu` flag to make `translator.py` run on the CPU. Note that when running inference on the CPU, you cannot use the `--half` parameter.
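
A minimal sketch of such an invocation, assuming only the flags confirmed in this thread (`--cpu`, `--half`); the input/output arguments are illustrative placeholders and may differ in your THUMT version:

```bash
# Run the PyTorch translator on the CPU.
# Do NOT combine --cpu with --half: half precision is unavailable on CPU.
python translator.py --cpu \
    --input input.txt \
    --output output.txt \
    ...   # model/vocabulary arguments as in your usual GPU invocation
```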