tensorflow / tensor2tensor

Library of deep learning models and datasets designed to make deep learning more accessible and accelerate ML research.
Apache License 2.0

Help for decoding after loading the model #1099

Open sugeeth14 opened 6 years ago

sugeeth14 commented 6 years ago

Hello, I am working through the translate_ende_wmt32k problem from the walkthrough. I have trained the model, but at inference time I don't want to decode a whole file at once; I want to decode sentences one by one as they arrive, so the model is loaded only once and then waits for input. Any help on how this could be done? Thanks.

martinpopel commented 6 years ago

You can try `--decode_interactive` (see https://github.com/tensorflow/tensor2tensor/blob/master/tensor2tensor/bin/t2t_decoder.py#L54), or the t2t query server.
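For reference, a minimal sketch of an interactive decoding run, assuming a model trained with the walkthrough's standard flags. The paths below are placeholders, and flag names can vary slightly between tensor2tensor versions (e.g. `--problems` vs `--problem`):

```bash
# Sketch: load the trained model once, then read source sentences from stdin
# and print translations as they are typed. Paths are hypothetical examples.
t2t-decoder \
  --data_dir=~/t2t_data \
  --problem=translate_ende_wmt32k \
  --model=transformer \
  --hparams_set=transformer_base \
  --output_dir=~/t2t_train/translate_ende_wmt32k \
  --decode_interactive
```

In interactive mode the session stays open, so you avoid reloading the graph and checkpoint for every sentence; for serving requests from other processes, the query-server route mentioned above is the usual alternative.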