cmusphinx / g2p-seq2seq

G2P with Tensorflow

Softmax not in graph #144

Closed fbkarsdorp closed 6 years ago

fbkarsdorp commented 6 years ago

After training a model, I tried to freeze it, but I keep running into the following error:

AssertionError: transformer/parallel_0_5/transformer/body/decoder/layer_0/self_attention/multihead_attention/dot_product_attention/Softmax is not in graph

Any idea what might be the issue here?

I'm using a fresh clone of g2p and TF 1.9.0.

nshmyrev commented 6 years ago

Any idea what might be the issue here?

Tensorflow guys changed their API again. They keep changing the API every month. This is why we are moving away from TF.

fbkarsdorp commented 6 years ago

OK. Thanks. So which version should I use?

nshmyrev commented 6 years ago

Try 1.8

fbkarsdorp commented 6 years ago

I've tried, and unfortunately it didn't work, nor with 1.5. Care to reopen this?

nshmyrev commented 6 years ago

What is your tensor2tensor version?

ShihabYasin commented 4 years ago

I'm getting a similar error. My tensor2tensor version is 1.7.0. Is there any solution?

nshmyrev commented 4 years ago

t2t must be 1.8. Not 1.5, not 1.6, not 1.7, not 1.9. 1.8.
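For reference, pinning the environment along these lines should match the maintainer's advice; only tensor2tensor 1.8 and TF 1.8 are stated in this thread, so treat the exact pins as a sketch, not a confirmed recipe:

```shell
# Sketch of an environment pin based on the advice above.
# Only tensorflow==1.8.0 and tensor2tensor==1.8.0 are confirmed in the thread;
# note these old releases generally require Python 2.7 or early Python 3.
pip install tensorflow==1.8.0 tensor2tensor==1.8.0

# Verify the pins took effect before retrying the freeze:
pip freeze | grep -E 'tensorflow|tensor2tensor'
```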

ShihabYasin commented 4 years ago

Checked. But I still got this error after running `g2p-seq2seq --model_dir model --freeze`:

    Use tf.compat.v1.graph_util.extract_sub_graph
    Traceback (most recent call last):
      File "/usr/local/bin/g2p-seq2seq", line 11, in <module>
        load_entry_point('g2p-seq2seq==6.2.2a0', 'console_scripts', 'g2p-seq2seq')()
      File "/usr/local/lib/python2.7/dist-packages/g2p_seq2seq-6.2.2a0-py2.7.egg/g2p_seq2seq/app.py", line 120, in main
        g2p_model.freeze()
      File "/usr/local/lib/python2.7/dist-packages/g2p_seq2seq-6.2.2a0-py2.7.egg/g2p_seq2seq/g2p.py", line 394, in freeze
        variable_names_blacklist=['global_step'])
      File "/usr/local/lib/python2.7/dist-packages/tensorflow/python/util/deprecation.py", line 324, in new_func
        return func(*args, **kwargs)
      File "/usr/local/lib/python2.7/dist-packages/tensorflow/python/framework/graph_util_impl.py", line 245, in convert_variables_to_constants
        inference_graph = extract_sub_graph(input_graph_def, output_node_names)
      File "/usr/local/lib/python2.7/dist-packages/tensorflow/python/util/deprecation.py", line 324, in new_func
        return func(*args, **kwargs)
      File "/usr/local/lib/python2.7/dist-packages/tensorflow/python/framework/graph_util_impl.py", line 181, in extract_sub_graph
        _assert_nodes_are_present(name_to_node, dest_nodes)
      File "/usr/local/lib/python2.7/dist-packages/tensorflow/python/framework/graph_util_impl.py", line 137, in _assert_nodes_are_present
        assert d in name_to_node, "%s is not in graph" % d
    AssertionError: transformer/parallel_0_5/transformer/body/decoder/layer_0/self_attention/multihead_attention/dot_product_attention/Softmax is not in graph

dpny518 commented 4 years ago

Does not work with

tensor2tensor            1.8.0       
tensorboard              1.13.1      
tensorflow-estimator     1.13.0      
tensorflow-gpu           1.13.1 

Same error:

AssertionError: transformer/parallel_0_5/transformer/body/decoder/layer_0/self_attention/multihead_attention/dot_product_attention/Softmax is not in graph
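The traceback shows what is actually failing: during freezing, `convert_variables_to_constants` calls `extract_sub_graph`, which asserts that every requested output node name exists in the graph. When tensor2tensor changes its op naming between versions, the hard-coded `Softmax` node name no longer matches any op, and the assertion fires. A minimal sketch of that check (mirroring `_assert_nodes_are_present` in TensorFlow's `graph_util_impl.py`; the node names below are hypothetical illustrations, not taken from a real checkpoint):

```python
# Sketch of the check that raises the AssertionError above, mirroring
# _assert_nodes_are_present in TensorFlow's graph_util_impl.py.
# In practice name_to_node maps op names in the checkpoint's GraphDef to
# their NodeDefs; here we use placeholder values for illustration.

def assert_nodes_are_present(name_to_node, dest_nodes):
    """Raise AssertionError if any requested output node is missing."""
    for d in dest_nodes:
        assert d in name_to_node, "%s is not in graph" % d

# Ops present in a (hypothetical) graph built by a newer tensor2tensor:
name_to_node = {
    "transformer/body/decoder/layer_0/self_attention/Softmax": object(),
    "global_step": object(),
}

# g2p-seq2seq asks for a node name from the older tensor2tensor layout,
# so the lookup fails and freezing aborts with the error seen above:
try:
    assert_nodes_are_present(
        name_to_node,
        ["transformer/parallel_0_5/transformer/body/decoder/layer_0/"
         "self_attention/multihead_attention/dot_product_attention/Softmax"])
except AssertionError as e:
    print(e)
```

This is why matching the tensor2tensor version to the one the code was written against (1.8, per the maintainer) resolves it: the op names in the rebuilt graph then line up with the names the freeze code requests.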