Closed fbkarsdorp closed 6 years ago
The TensorFlow team changed their API again. They keep changing it every month. This is why we are moving away from TF.
OK. Thanks. So which version should I use?
Try 1.8
I've tried, and unfortunately it didn't work, nor with 1.5. Care to reopen this?
What is your tensor2tensor version?
I'm getting a similar error. My tensor2tensor version is 1.7.0. Is there any solution?
t2t must be 1.8: not 1.5, not 1.6, not 1.7, not 1.9. 1.8.
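A minimal sketch of setting up an environment pinned to those versions (the virtualenv name `g2p-env` is arbitrary, and installing g2p-seq2seq from your checkout via `setup.py` is an assumption based on the egg paths in the traceback):

```shell
# Sketch: a throwaway environment pinned to the recommended 1.8 releases.
virtualenv g2p-env
. g2p-env/bin/activate
pip install tensorflow==1.8.0 tensor2tensor==1.8.0
# then, from your g2p-seq2seq checkout (assumed install step):
# python setup.py install
# g2p-seq2seq --model_dir model --freeze
```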
Checked. But I got this error after running `g2p-seq2seq --model_dir model --freeze`:

```
Use tf.compat.v1.graph_util.extract_sub_graph
Traceback (most recent call last):
  File "/usr/local/bin/g2p-seq2seq", line 11, in <module>
    load_entry_point('g2p-seq2seq==6.2.2a0', 'console_scripts', 'g2p-seq2seq')()
  File "/usr/local/lib/python2.7/dist-packages/g2p_seq2seq-6.2.2a0-py2.7.egg/g2p_seq2seq/app.py", line 120, in main
    g2p_model.freeze()
  File "/usr/local/lib/python2.7/dist-packages/g2p_seq2seq-6.2.2a0-py2.7.egg/g2p_seq2seq/g2p.py", line 394, in freeze
    variable_names_blacklist=['global_step'])
  File "/usr/local/lib/python2.7/dist-packages/tensorflow/python/util/deprecation.py", line 324, in new_func
    return func(*args, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/tensorflow/python/framework/graph_util_impl.py", line 245, in convert_variables_to_constants
    inference_graph = extract_sub_graph(input_graph_def, output_node_names)
  File "/usr/local/lib/python2.7/dist-packages/tensorflow/python/util/deprecation.py", line 324, in new_func
    return func(*args, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/tensorflow/python/framework/graph_util_impl.py", line 181, in extract_sub_graph
    _assert_nodes_are_present(name_to_node, dest_nodes)
  File "/usr/local/lib/python2.7/dist-packages/tensorflow/python/framework/graph_util_impl.py", line 137, in _assert_nodes_are_present
    assert d in name_to_node, "%s is not in graph" % d
AssertionError: transformer/parallel_0_5/transformer/body/decoder/layer_0/self_attention/multihead_attention/dot_product_attention/Softmax is not in graph
```
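For context, the check that fires in the traceback is simple: `extract_sub_graph` builds a name-to-node map and asserts that every output node requested by `freeze()` exists in the graph. A minimal sketch of that behavior, with plain dicts standing in for the real GraphDef node map:

```python
# Minimal sketch of the check behind the traceback above (not the real
# TF code): freeze() asks graph_util to extract a subgraph ending at
# specific node names, and the assertion fails when a name is absent.
def assert_nodes_are_present(name_to_node, dest_nodes):
    for d in dest_nodes:
        assert d in name_to_node, "%s is not in graph" % d

# A graph built by a different tensor2tensor version uses different
# scope names, so the Softmax node freeze() asks for is simply not there.
graph_nodes = {"transformer/body/decoder/layer_0/output": None}
requested = ["transformer/parallel_0_5/transformer/body/decoder/layer_0/"
             "self_attention/multihead_attention/dot_product_attention/Softmax"]
try:
    assert_nodes_are_present(graph_nodes, requested)
except AssertionError as err:
    print(err)
```

Which suggests the problem is not the checkpoint itself but the node names tensor2tensor generates when rebuilding the graph, hence the advice to match versions.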
Does not work with:

```
tensor2tensor        1.8.0
tensorboard          1.13.1
tensorflow-estimator 1.13.0
tensorflow-gpu       1.13.1
```
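For anyone hitting this, a small sketch that checks the installed versions against the 1.8 pin before attempting the freeze. Package names are as they appear on PyPI; it assumes Python 3.8+ for `importlib.metadata`, whereas the tracebacks in this thread show a Python 2.7 install (where `pkg_resources` would be the equivalent):

```python
# Sketch: verify installed package versions against the recommended pin
# (tensorflow and tensor2tensor both at 1.8.x) before running
# `g2p-seq2seq --freeze`. Assumes Python 3.8+ for importlib.metadata.
from importlib.metadata import version, PackageNotFoundError

PINNED = {"tensorflow": "1.8", "tensor2tensor": "1.8"}

def check_versions(pinned=PINNED):
    """Return a list of mismatch messages; empty means the pin is satisfied."""
    problems = []
    for pkg, want in pinned.items():
        try:
            have = version(pkg)
        except PackageNotFoundError:
            problems.append("%s: not installed (want %s.x)" % (pkg, want))
            continue
        if not (have == want or have.startswith(want + ".")):
            problems.append("%s: found %s, want %s.x" % (pkg, have, want))
    return problems
```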
Same error:

```
AssertionError: transformer/parallel_0_5/transformer/body/decoder/layer_0/self_attention/multihead_attention/dot_product_attention/Softmax is not in graph
```
After training a model, I tried to freeze it, but I keep running into the following error:

```
AssertionError: transformer/parallel_0_5/transformer/body/decoder/layer_0/self_attention/multihead_attention/dot_product_attention/Softmax is not in graph
```

Any idea what might be the issue here? I'm using a fresh clone of g2p and TF 1.9.0.