cmusphinx / g2p-seq2seq

G2P with Tensorflow

Freeze model doesn't work #149

Open glprophet opened 6 years ago

glprophet commented 6 years ago

Hi,

I'm trying to freeze a model I trained. It fails with a message saying it can't find the node `transformer/parallel_0_5/transformer/body/decoder/layer_0/self_attention/multihead_attention/dot_product_attention/Softmax`. Your output_node_names are wrong. Could you please fix it, or at least publish a list of correct node names I can use?
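For reference, the node names that actually exist in a trained graph can be listed with a short script like the sketch below (assuming a TensorFlow 1.x checkpoint; the path `model.ckpt.meta` and the filter string are illustrative):

```python
import tensorflow as tf

# Load the graph structure from the checkpoint's meta file
# (path is illustrative) and print candidate attention nodes.
tf.train.import_meta_graph("model.ckpt.meta")
for node in tf.get_default_graph().as_graph_def().node:
    if "dot_product_attention" in node.name:
        print(node.name)
```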

Thanks, glprophet

eliaho commented 5 years ago

Did anyone manage to solve this issue?

luduling commented 5 years ago

env: Tensorflow 1.12.0, Tensor2Tensor 1.7

In g2p.py, change the line

```python
output_node_names = ["transformer/parallel_0_5/transformer/body/decoder/"
                     "layer_0/self_attention/multihead_attention/dot_product_attention/"
                     "Softmax" ...]
```

to

```python
output_node_names = [
    "transformer/parallel_0_4/transformer/transformer/body/encoder/"
    "layer_0/self_attention/multihead_attention/dot_product_attention/"
    "attention_weights",
    "transformer/parallel_0_4/transformer/transformer/body/encoder/"
    "layer_1/self_attention/multihead_attention/dot_product_attention/"
    "attention_weights",
    "transformer/parallel_0_4/transformer/transformer/body/encoder/"
    "layer_2/self_attention/multihead_attention/dot_product_attention/"
    "attention_weights",
    "transformer/parallel_0_4/transformer/transformer/body/decoder/"
    "layer_0/self_attention/multihead_attention/dot_product_attention/"
    "attention_weights",
    "transformer/parallel_0_4/transformer/transformer/body/decoder/"
    "layer_0/encdec_attention/multihead_attention/dot_product_attention/"
    "attention_weights",
    "transformer/parallel_0_4/transformer/transformer/body/decoder/"
    "layer_1/self_attention/multihead_attention/dot_product_attention/"
    "attention_weights",
    "transformer/parallel_0_4/transformer/transformer/body/decoder/"
    "layer_1/encdec_attention/multihead_attention/dot_product_attention/"
    "attention_weights",
    "transformer/parallel_0_4/transformer/transformer/body/decoder/"
    "layer_2/self_attention/multihead_attention/dot_product_attention/"
    "attention_weights",
    "transformer/parallel_0_4/transformer/transformer/body/decoder/"
    "layer_2/encdec_attention/multihead_attention/dot_product_attention/"
    "attention_weights",
]
```

Remember to reinstall afterwards by running: python setup.py install
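For reference, freezing with these node names looks roughly like the sketch below (TensorFlow 1.x; the checkpoint and output paths are illustrative, and the list is abbreviated to its first entry):

```python
import tensorflow as tf
from tensorflow.python.framework import graph_util

# The full list of nine attention_weights nodes from above goes here.
output_node_names = [
    "transformer/parallel_0_4/transformer/transformer/body/encoder/"
    "layer_0/self_attention/multihead_attention/dot_product_attention/"
    "attention_weights",
    # ... remaining encoder/decoder attention_weights nodes ...
]

with tf.Session() as sess:
    # Restore the trained variables (paths are illustrative).
    saver = tf.train.import_meta_graph("model.ckpt.meta")
    saver.restore(sess, "model.ckpt")
    # Replace variables with constants, keeping only the subgraph
    # needed to compute the listed output nodes.
    frozen = graph_util.convert_variables_to_constants(
        sess, sess.graph_def, output_node_names)
    with tf.gfile.GFile("frozen_graph.pb", "wb") as f:
        f.write(frozen.SerializeToString())
```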

Done!

glprophet commented 5 years ago

Hi,

Thanks for fixing it. I can freeze the graph now. However, to use the frozen graph in production, it needs to contain input placeholder(s). I guess an input placeholder node should be added to the graph before freezing it. Could you please fix this, or post a code snippet showing how it can be done?
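For illustration, production inference against a frozen graph needs a named input tensor to bind feed_dict to, roughly like the sketch below (the placeholder name `input_ids:0` is hypothetical and does not exist in the current graph; the output node is one of those listed above):

```python
import tensorflow as tf

# Load the frozen GraphDef (path is illustrative).
with tf.gfile.GFile("frozen_graph.pb", "rb") as f:
    graph_def = tf.GraphDef()
    graph_def.ParseFromString(f.read())

with tf.Graph().as_default() as graph:
    tf.import_graph_def(graph_def, name="")
    # "input_ids:0" is a hypothetical placeholder name; the graph would
    # need such a node added before freezing for this to work.
    inputs = graph.get_tensor_by_name("input_ids:0")
    outputs = graph.get_tensor_by_name(
        "transformer/parallel_0_4/transformer/transformer/body/decoder/"
        "layer_2/encdec_attention/multihead_attention/dot_product_attention/"
        "attention_weights:0")
    with tf.Session(graph=graph) as sess:
        result = sess.run(outputs, feed_dict={inputs: [[1, 2, 3]]})  # dummy ids
```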

Thanks, glprophet