mchalecki closed this issue 7 years ago.

Hi. Is it possible, or is it planned, to build an interface that keeps the model in memory and then decodes whatever sentence is passed in? I want to load the model once and test it interactively. Writing a text file and then launching the trainer, which reloads everything each time, is slow and inefficient.
We'll be happy to help if you need it. We're working on open-sourcing a Python notebook soon, which would be one way to do this. Even now, you can run t2t_trainer --decode_interactive and then just pass sentences. Unfortunately, this reloads the model due to tf.learn limitations, but we're working on that too. Let us know precisely what you need and we'll be happy to help (and if you want it fast, feel free to send PRs, of course).
In the new release, we added an example of how to use Tensor2Tensor with a raw tf.Session and placeholders. That lets you keep the model in memory and feed it batches of data for inference or evaluation as you please. Take a look at the following test for an end-to-end example: https://github.com/tensorflow/tensor2tensor/blob/master/tensor2tensor/utils/trainer_utils_test.py#L96
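The general pattern, stripped of T2T specifics, is: build the graph once over a placeholder, open one session, and reuse it. A minimal TF1-style sketch (the toy `build_inference_graph` below is a stand-in for the real model construction shown in the linked test, not a Tensor2Tensor API):

```python
import tensorflow as tf

def build_inference_graph(inputs_ph):
    # Toy stand-in graph: embedding -> dense -> argmax. In Tensor2Tensor,
    # this is where the registered model would be built (see the linked test).
    embeddings = tf.get_variable("embeddings", shape=[100, 16])
    hidden = tf.nn.embedding_lookup(embeddings, inputs_ph)
    logits = tf.layers.dense(hidden, units=100)
    return tf.argmax(logits, axis=-1)

inputs_ph = tf.placeholder(tf.int32, shape=[None, None], name="inputs")
outputs = build_inference_graph(inputs_ph)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    # With a trained model you would restore a checkpoint here instead:
    # tf.train.Saver().restore(sess, checkpoint_path)
    # The graph now stays in memory; feed batches as often as you like.
    for batch in [[[3, 7, 12, 1]], [[5, 9, 1]]]:
        print(sess.run(outputs, feed_dict={inputs_ph: batch}))
```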
Thank you very much. I'll definitely check out how it works. At first glance it looks great. Keep up the great work ;)
@lukaszkaiser `trainer_utils_test.py` was removed in the v1.4 release and replaced by `tpu_trainer_lib_test.py`, but that file only provides the dataset method and lacks a raw-session example. Does it cover the same ground as `trainer_utils_test.py` did?
@efeiefei you could check the function `TestModel()`, which is a good example of using a raw session, except that it uses the default dataset from a pre-registered problem. I'd also suggest taking a look at https://github.com/tensorflow/tensor2tensor/blob/master/tensor2tensor/bin/t2t_decoder.py , where you'll find some useful information in `score_file()`.
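To give a flavor of the `score_file()` pattern without reading the whole file, here is a sketch (module paths and the model-call signature drifted between T2T releases, e.g. `tpu_trainer_lib` later became `trainer_lib`, and the problem/model/hparams names below are just examples, so treat the linked source as authoritative):

```python
import numpy as np
import tensorflow as tf
from tensor2tensor.utils import registry, trainer_lib

# Example configuration; substitute your own problem, model and hparams_set.
hparams = trainer_lib.create_hparams(
    "transformer_base", data_dir="/tmp/t2t_data",
    problem_name="translate_ende_wmt32k")

# T2T feeds features as a dict of 4-D int tensors.
inputs_ph = tf.placeholder(tf.int32, shape=[None, None, 1, 1], name="inputs")
targets_ph = tf.placeholder(tf.int32, shape=[None, None, 1, 1], name="targets")
features = {"inputs": inputs_ph, "targets": targets_ph}

# Build the registered model once, in EVAL mode, to get the per-batch loss.
model = registry.model("transformer")(hparams, tf.estimator.ModeKeys.EVAL)
logits, losses = model(features)

saver = tf.train.Saver()
with tf.Session() as sess:
    saver.restore(sess, tf.train.latest_checkpoint("/tmp/t2t_train"))
    # The model stays resident; score batch after batch without reloading.
    feed = {inputs_ph: np.reshape([3, 7, 12, 1], [1, 4, 1, 1]),
            targets_ph: np.reshape([5, 9, 1], [1, 3, 1, 1])}
    print(sess.run(losses["training"], feed_dict=feed))
```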