Open · alsora opened this issue 6 years ago
Yes, we want to fix this problem. It arises because we call the decode_word() method iteratively, passing it each new word, whereas the tensor2tensor implementation of interactive decoding creates an input_fn() and passes it to the tf.estimator.
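A minimal sketch of that pattern (illustrative code, not the repository's actual implementation; the class and method names are assumptions): `estimator.predict()` builds its graph only on the first pull from the generator it returns, so feeding it through a never-ending Python generator lets every later word reuse the same session.

```python
import tensorflow as tf

class PersistentDecoder(object):
    """Keeps a single estimator.predict() generator alive so the graph
    is built once, instead of once per decoded word (sketch only)."""

    def __init__(self, estimator):
        self._estimator = estimator
        self._current_word = None
        self._predictions = None

    def _generator(self):
        # Each yield hands the most recently stored word to predict();
        # the loop blocks inside next(self._predictions) between calls.
        while True:
            yield self._current_word

    def _input_fn(self):
        dataset = tf.data.Dataset.from_generator(
            self._generator,
            output_types=tf.int32,
            output_shapes=tf.TensorShape([None]))
        return dataset.batch(1)

    def decode_word(self, encoded_word):
        self._current_word = encoded_word
        if self._predictions is None:
            # Graph and session construction happen here, once.
            self._predictions = self._estimator.predict(self._input_fn)
        return next(self._predictions)
```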
Fixed with commit: a1e5722bf5545beae3f7051623ca03f5b7a89793
@nurtas-m confirmed that it works now! Thanks a lot, that's an important fix 💯
decode_word() is still broken. @nurtas-m, this is an extremely important, high-priority issue. Please do not commit half-fixes.
Hi,
Thank you for this very nice repository; I have been able to perform training and testing easily.
However, I noticed strange behavior when performing inference in interactive mode. My goal is to load a frozen model and then perform decoding once in a while.
I have been able to freeze a trained model, and it is then immediately recognized by the module. This is the code that is executed in interactive mode:
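(The original snippet was not preserved in this thread; a minimal reconstruction of such an interactive loop, with illustrative names, might look like the following.)

```python
# Hypothetical reconstruction -- the issue's actual snippet is lost.
g2p_model = G2PModel(params)      # illustrative class and setup
g2p_model.load_decode_model()     # illustrative: load the frozen model once

while True:
    word = input("> ").strip()    # raw_input() in Python 2
    if not word:
        break
    # Reported behavior: each call below reloads the graph, so every
    # word takes roughly as long as the initial model load.
    print(g2p_model.decode_word(word))
```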
The problem is that the self.decode_word() method loads the graph again on every call, resulting in very slow inference.
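In other words, what I am after is the usual load-once pattern for a frozen TF 1.x graph, roughly like the sketch below (illustrative only; the tensor names are placeholders, not the ones this project actually exports):

```python
import tensorflow as tf

class FrozenDecoder(object):
    """Imports the frozen graph and creates the session a single time;
    decode_word() then only runs the session (sketch only)."""

    def __init__(self, frozen_graph_path):
        graph_def = tf.GraphDef()
        with tf.gfile.GFile(frozen_graph_path, "rb") as f:
            graph_def.ParseFromString(f.read())
        self._graph = tf.Graph()
        with self._graph.as_default():
            tf.import_graph_def(graph_def, name="")
        self._sess = tf.Session(graph=self._graph)
        # Placeholder tensor names, for illustration only.
        self._inputs = self._graph.get_tensor_by_name("inputs:0")
        self._outputs = self._graph.get_tensor_by_name("outputs:0")

    def decode_word(self, encoded_word):
        # No graph (re)construction here, so repeated calls stay fast.
        return self._sess.run(
            self._outputs, feed_dict={self._inputs: encoded_word})
```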
Do you have any idea how to fix this problem?
Thank you