cmusphinx / g2p-seq2seq

G2P with Tensorflow

Frozen graph in interactive mode reloaded every time #120

Open alsora opened 6 years ago

alsora commented 6 years ago

Hi,

Thank you for this very nice repository; I have been able to easily perform training and testing.

However, I noticed a strange behavior when performing inference in interactive mode. My goal is to load a frozen model and then decode a word every once in a while.

I have been able to freeze a trained model, and it is immediately recognized by the module. This is the code that is executed in interactive mode:

    if os.path.exists(self.frozen_graph_filename):
        with tf.Session(graph=self.graph) as sess:
            inp = tf.placeholder(tf.string, name="inp_decode")[0]
            decode_op = tf.py_func(self.decode_word, [inp], tf.string)
            while True:
                word = get_word()
                result = self.__run_op(sess, decode_op, word)
                print("output: " + result)

The problem is that the method "self.decode_word" loads the graph again on every call, resulting in very slow inference:

INFO:tensorflow:Restoring parameters from data/models/cmudict/model.ckpt-200000
[2018-05-06 13:45:09,823] Restoring parameters from data/models/cmudict/model.ckpt-200000
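
For comparison, this is roughly the behavior I would expect (a minimal sketch, assuming the checkpoint can be restored once outside the loop; the tensor names and the meta graph path here are assumptions, not the module's actual API):

    import tensorflow as tf

    # Hypothetical sketch: restore the checkpoint a single time, then reuse the
    # same session so that only sess.run() happens inside the loop.
    graph = tf.Graph()
    with graph.as_default():
        saver = tf.train.import_meta_graph("data/models/cmudict/model.ckpt-200000.meta")
        inp = graph.get_tensor_by_name("inp_decode:0")        # assumed input tensor name
        decode_op = graph.get_tensor_by_name("decode_op:0")   # assumed output tensor name

    with tf.Session(graph=graph) as sess:
        saver.restore(sess, "data/models/cmudict/model.ckpt-200000")  # restored once
        while True:
            word = get_word()
            result = sess.run(decode_op, feed_dict={inp: word})
            print("output: " + result)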

Do you have any idea how to fix this problem?

Thank you

nurtas-m commented 6 years ago

Yes, we would like to fix this problem. The problem arises because we want to call the decode_word() method iteratively and pass the input word to it. But in the tensor2tensor implementation of interactive decoding, they create an input_fn() and pass it to the tf.estimator.
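
One common way around that constraint (just a sketch of the general pattern, not necessarily the fix that was eventually committed; the `estimator` object and the "inputs" feature name are assumptions) is to back the input_fn() with a generator, so a single estimator.predict() call keeps the model loaded while new words are pushed through a queue:

    import queue
    import tensorflow as tf

    # Hypothetical sketch: feed words to estimator.predict() through a blocking
    # queue, so the checkpoint is restored only once for the whole session.
    word_queue = queue.Queue()

    def word_generator():
        # Blocks until the next word arrives; predict() stays open while this yields.
        while True:
            yield {"inputs": word_queue.get()}

    def input_fn():
        return tf.data.Dataset.from_generator(
            word_generator,
            output_types={"inputs": tf.string},
            output_shapes={"inputs": tf.TensorShape([])})

    # `estimator` is assumed to be the tf.estimator.Estimator built for this model.
    predictions = estimator.predict(input_fn)  # lazy; the model is loaded on the first next()

    def decode_word(word):
        word_queue.put(word)
        return next(predictions)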

nurtas-m commented 6 years ago

Fixed with commit: a1e5722bf5545beae3f7051623ca03f5b7a89793

loretoparisi commented 6 years ago

@nurtas-m confirmed that it works now! Thanks a lot, that's an important fix 💯

nshmyrev commented 6 years ago

Decode word is still broken. @nurtas-m, this is an extremely important and extremely high-priority issue. Please do not commit half-fixes.