glprophet opened this issue 6 years ago
Did anyone manage to solve this issue?
Env: TensorFlow 1.12.0, Tensor2Tensor 1.7

Modify g2p.py, changing the line

```python
output_node_names = ["transformer/parallel_0_5/transformer/body/decoder/"
                     "layer_0/self_attention/multihead_attention/dot_product_attention/"
                     "Softmax"...]
```

to

```python
output_node_names = [
    "transformer/parallel_0_4/transformer/transformer/body/encoder/"
    "layer_0/self_attention/multihead_attention/dot_product_attention/"
    "attention_weights",
    "transformer/parallel_0_4/transformer/transformer/body/encoder/"
    "layer_1/self_attention/multihead_attention/dot_product_attention/"
    "attention_weights",
    "transformer/parallel_0_4/transformer/transformer/body/encoder/"
    "layer_2/self_attention/multihead_attention/dot_product_attention/"
    "attention_weights",
    "transformer/parallel_0_4/transformer/transformer/body/decoder/"
    "layer_0/self_attention/multihead_attention/dot_product_attention/"
    "attention_weights",
    "transformer/parallel_0_4/transformer/transformer/body/decoder/"
    "layer_0/encdec_attention/multihead_attention/dot_product_attention/"
    "attention_weights",
    "transformer/parallel_0_4/transformer/transformer/body/decoder/"
    "layer_1/self_attention/multihead_attention/dot_product_attention/"
    "attention_weights",
    "transformer/parallel_0_4/transformer/transformer/body/decoder/"
    "layer_1/encdec_attention/multihead_attention/dot_product_attention/"
    "attention_weights",
    "transformer/parallel_0_4/transformer/transformer/body/decoder/"
    "layer_2/self_attention/multihead_attention/dot_product_attention/"
    "attention_weights",
    "transformer/parallel_0_4/transformer/transformer/body/decoder/"
    "layer_2/encdec_attention/multihead_attention/dot_product_attention/"
    "attention_weights",
]
```
Remember to reinstall with the command: `python setup.py install`
Done!
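For anyone who wants to see the freezing step in context, it boils down to something like this (a minimal TF 1.x sketch; `checkpoint_path` and `frozen_path` are placeholders, and g2p-seq2seq's own session handling may differ):

```python
import tensorflow as tf

# Hypothetical paths; substitute your own checkpoint and output file.
checkpoint_path = "model_dir/model.ckpt"
frozen_path = "model_dir/frozen_graph.pb"

with tf.Session() as sess:
    # Rebuild the graph from the checkpoint's meta file and restore weights.
    saver = tf.train.import_meta_graph(checkpoint_path + ".meta")
    saver.restore(sess, checkpoint_path)
    # Convert every variable reachable from the output nodes into constants,
    # using the corrected output_node_names list from above.
    frozen_graph_def = tf.graph_util.convert_variables_to_constants(
        sess, sess.graph_def, output_node_names)
    with tf.gfile.GFile(frozen_path, "wb") as f:
        f.write(frozen_graph_def.SerializeToString())
```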
Hi,
Thanks for fixing it. I can freeze the graph now. However, to use the frozen graph in production, the graph needs to contain input placeholder(s). I guess you should add an input placeholder node to the graph before freezing it. Could you please fix this, or post a code snippet showing how it can be done?
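To illustrate what I mean, something along these lines (a rough sketch, not the project's actual API; the original input tensor name `IteratorGetNext:0` is a guess that would need to be checked against the real graph):

```python
import tensorflow as tf

# Load the frozen GraphDef produced by the freezing step.
with tf.gfile.GFile("model_dir/frozen_graph.pb", "rb") as f:
    graph_def = tf.GraphDef()
    graph_def.ParseFromString(f.read())

graph = tf.Graph()
with graph.as_default():
    # Feedable placeholder for a batch of token id sequences; the dtype and
    # shape are assumptions and must match what the model actually consumes.
    inputs = tf.placeholder(tf.int32, shape=[None, None], name="inputs")
    # Splice the placeholder in front of the graph by remapping the original
    # input tensor. "IteratorGetNext:0" is hypothetical; inspect your graph
    # for the real input op name.
    tf.import_graph_def(graph_def,
                        input_map={"IteratorGetNext:0": inputs},
                        name="")
```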
Thanks, glprophet
Hi,
I'm trying to freeze a model I trained. It fails with the message: can't find a node `transformer/parallel_0_5/transformer/body/decoder/layer_0/self_attention/multihead_attention/dot_product_attention/Softmax`. Your output_node_names are wrong. Could you please fix them, or at least publish a list of correct node names I can use?
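In case it helps, this is the snippet I used to dump the node names from my graph to look for valid outputs (a minimal sketch; the checkpoint path is a placeholder):

```python
import tensorflow as tf

# Hypothetical checkpoint path; point this at your trained model.
checkpoint_path = "model_dir/model.ckpt"

# Rebuild the graph from the checkpoint's meta file, then print every node
# name so the correct output nodes (e.g. the attention_weights or Softmax
# ops) can be found by inspection.
tf.train.import_meta_graph(checkpoint_path + ".meta")
for node in tf.get_default_graph().as_graph_def().node:
    print(node.name)
```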
Thanks, glprophet