tensorflow / nmt

TensorFlow Neural Machine Translation Tutorial
Apache License 2.0
6.36k stars 1.96k forks

Use model for production #375

Open AlkaSaliss opened 6 years ago

AlkaSaliss commented 6 years ago

Hi, I've trained an NMT model and would now like to use it as part of an application. The inference command provided in the tutorial works well for evaluating the model during development:

```
python -m nmt.nmt \
    --out_dir=/tmp/nmt_model \
    --inference_input_file=/tmp/my_infer_file.vi \
    --inference_output_file=/tmp/nmt_model/output_infer
```

But it takes fairly long to execute, because the model is reloaded every time the command runs. My question is: how should I modify the codebase so that the model is loaded once and then accepts strings as input and translates them? In other words, which files/methods should I modify to embed the model in, say, a small Flask app? Thanks.
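The general pattern being asked about is "load once, translate many": do the expensive setup a single time at process startup, keep the resulting object alive, and have each request reuse it. Below is a minimal sketch of that pattern. The names (`Translator`, `translate`) and the stub body are illustrative only, not part of the nmt codebase; in a real server, `__init__` would build the inference graph and restore the checkpoint from `out_dir`, and `translate()` would run the session on the input sentence. The object would then be created once at Flask startup and called from a route handler.

```python
class Translator:
    """Holds a (stubbed) model loaded once; translate() reuses it per request."""

    load_calls = 0  # counts expensive loads, to illustrate it happens only once

    def __init__(self, model_dir):
        # One-time expensive setup. A real implementation would build the
        # TensorFlow inference graph here and restore the latest checkpoint
        # found under model_dir.
        Translator.load_calls += 1
        self.model_dir = model_dir

    def translate(self, sentence):
        # Placeholder for running the model: a real implementation would feed
        # `sentence` to the session and return the decoded target sentence.
        # Here we just reverse the words as a stand-in "translation".
        return " ".join(reversed(sentence.split()))


# Created once at application startup (e.g. at Flask module import time),
# then shared by every request handler.
translator = Translator("/tmp/nmt_model")

print(translator.translate("hello world"))   # first request
print(translator.translate("another input")) # second request: no reload
print(Translator.load_calls)                 # the model was loaded only once
```

In a Flask app, the `translator` object above would live at module level, and a route such as `@app.route("/translate")` would simply call `translator.translate(request_text)`, so the graph and checkpoint restore cost is paid only once per process rather than once per request.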