tensorflow / models

Models and examples built with TensorFlow

Impossible to use trained syntaxnet model #5449

Closed sudoandros closed 6 years ago

sudoandros commented 6 years ago

System information

Describe the problem

Python throws the exception ValueError: Op type not registered 'FeatureSize' in binary running on andrei-K501LB (the name of my computer). Make sure the Op and Kernel are registered in the binary running in this process. while building NodeDef 'FeatureSize'. As a result, it is impossible to use the trained syntaxnet model to analyze text. I haven't found any mention of this error in the syntaxnet issues, in the syntaxnet documentation/tutorial, or on Stack Overflow. What could be going wrong?
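For readers hitting the same wall: "Op type not registered" is raised at graph-construction time when TensorFlow looks an op name up in a process-wide registry and finds nothing. Built-in ops are registered when TensorFlow is imported; custom ops like FeatureSize only appear after the shared library that defines them has been loaded into the same Python process. The sketch below is a toy model of that lookup, not TensorFlow's actual code; the names OP_REGISTRY, register_op, and create_op are illustrative only.

```python
# Toy model (not TensorFlow itself) of why "Op type not registered"
# is raised: graph construction looks the op name up in a process-wide
# registry that custom-op libraries populate when they are imported.

OP_REGISTRY = {}  # op name -> kernel; filled at import time


def register_op(name, kernel):
    """What a custom-op shared library does when it is loaded."""
    OP_REGISTRY[name] = kernel


def create_op(op_type, *args):
    """What graph construction does for every NodeDef it builds."""
    if op_type not in OP_REGISTRY:
        raise ValueError(
            "Op type not registered '%s' in binary running in this "
            "process. Make sure the Op and Kernel are registered in "
            "the binary running in this process." % op_type)
    return OP_REGISTRY[op_type](*args)


# Built-in ops are registered as a side effect of importing TensorFlow...
register_op("Add", lambda a, b: a + b)
print(create_op("Add", 2, 3))

# ...but 'FeatureSize' lives in SyntaxNet's parser-ops shared library,
# so it only exists after that library has been loaded into this process.
try:
    create_op("FeatureSize")
except ValueError as e:
    print(e)
```

In real SyntaxNet code this loading is typically done by importing the registration modules (for example `from syntaxnet import load_parser_ops`) before building the graph; whether that module name matches your installed version is an assumption worth verifying against your own site-packages.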

Source code / logs

ValueError Traceback (most recent call last)

in ()
     44     return annotate_sentence
     45
---> 46 segmenter_model = load_model("data/en/segmenter", "spec.textproto", "checkpoint")
     47 parser_model = load_model("data/en", "parser_spec.textproto", "checkpoint")

in load_model(base_dir, master_spec_name, checkpoint_name)
     20     with open(os.path.join(base_dir, master_spec_name), "r") as f:
     21         text_format.Merge(f.read(), master_spec)
---> 22     spec_builder.complete_master_spec(master_spec, None, base_dir)
     23     logging.set_verbosity(logging.WARN)  # Turn off TensorFlow spam.
     24

/home/andrei/models/venv/local/lib/python2.7/site-packages/dragnn/python/spec_builder.pyc in complete_master_spec(master_spec, lexicon_corpus, output_path, tf_master)
    273     builder = ComponentSpecBuilder(spec.name)
    274     builder.spec = spec
--> 275     builder.fill_from_resources(output_path, tf_master=tf_master)
    276     master_spec.component[i].CopyFrom(builder.spec)
    277

/home/andrei/models/venv/local/lib/python2.7/site-packages/dragnn/python/spec_builder.pyc in fill_from_resources(self, resource_path, tf_master)
    234     with tf.Session(tf_master) as sess:
    235       feature_sizes, domain_sizes, _, num_actions = sess.run(
--> 236           gen_parser_ops.feature_size(task_context_str=str(context)))
    237       self.spec.num_actions = int(num_actions)
    238       for i in xrange(len(feature_sizes)):

/home/andrei/models/venv/local/lib/python2.7/site-packages/syntaxnet/ops/gen_parser_ops.pyc in feature_size(task_context, task_context_str, arg_prefix, name)
    321   result = _op_def_lib.apply_op("FeatureSize", task_context=task_context,
    322                                 task_context_str=task_context_str,
--> 323                                 arg_prefix=arg_prefix, name=name)
    324   return _FeatureSizeOutput._make(result)
    325

/home/andrei/models/venv/local/lib/python2.7/site-packages/tensorflow/python/framework/op_def_library.pyc in apply_op(self, op_type_name, name, **keywords)
    326     """
    327     output_structure, is_stateful, op = self._apply_op_helper(
--> 328         op_type_name, name, **keywords)
    329     if output_structure:
    330       outputs = op.outputs

/home/andrei/models/venv/local/lib/python2.7/site-packages/tensorflow/python/framework/op_def_library.pyc in _apply_op_helper(self, op_type_name, name, **keywords)
    785       op = g.create_op(op_type_name, inputs, output_types, name=scope,
    786                        input_types=input_types, attrs=attr_protos,
--> 787                        op_def=op_def)
    788     return output_structure, op_def.is_stateful, op
    789

/home/andrei/models/venv/local/lib/python2.7/site-packages/tensorflow/python/framework/ops.pyc in create_op(self, op_type, inputs, dtypes, input_types, name, attrs, op_def, compute_shapes, compute_device)
   3390           input_types=input_types,
   3391           original_op=self._default_original_op,
-> 3392           op_def=op_def)
   3393
   3394     # Note: shapes are lazily computed with the C API enabled.

/home/andrei/models/venv/local/lib/python2.7/site-packages/tensorflow/python/framework/ops.pyc in __init__(self, node_def, g, inputs, output_types, control_inputs, input_types, original_op, op_def)
   1732           op_def, inputs, node_def.attr)
   1733       self._c_op = _create_c_op(self._graph, node_def, grouped_inputs,
-> 1734                                 control_input_ops)
   1735     else:
   1736       self._c_op = None

/home/andrei/models/venv/local/lib/python2.7/site-packages/tensorflow/python/framework/ops.pyc in _create_c_op(graph, node_def, inputs, control_inputs)
   1568   except errors.InvalidArgumentError as e:
   1569     # Convert to ValueError for backwards compatibility.
-> 1570     raise ValueError(str(e))
   1571
   1572   return c_op

ValueError: Op type not registered 'FeatureSize' in binary running on andrei-K501LB. Make sure the Op and Kernel are registered in the binary running in this process. while building NodeDef 'FeatureSize'
sudoandros commented 6 years ago

I just checked the code in the training_tutorial.ipynb notebook, and it fails with a similar exception. The only change I made was setting the DATA_DIR variable to 'data/es'. The exception is ValueError: Op type not registered 'LexiconBuilder' in binary running on andrei-K501LB. Make sure the Op and Kernel are registered in the binary running in this process. while building NodeDef 'LexiconBuilder'

Traceback: ValueError Traceback (most recent call last)

in ()
     42 # Constructs lexical resources for SyntaxNet in the given resource path, from
     43 # the training data.
---> 44 lexicon.build_lexicon(DATA_DIR, TRAINING_CORPUS_PATH)
     45
     46 # Construct the 'lookahead' ComponentSpec. This is a simple right-to-left RNN

/home/andrei/models/venv/local/lib/python2.7/site-packages/dragnn/python/lexicon.pyc in build_lexicon(output_path, training_corpus_path, tf_master, training_corpus_format, morph_to_pos, **kwargs)
     71     sess.run(
     72         gen_parser_ops.lexicon_builder(
---> 73             task_context_str=str(context), corpus_name='corpus', **kwargs))

/home/andrei/models/venv/local/lib/python2.7/site-packages/syntaxnet/ops/gen_parser_ops.pyc in lexicon_builder(task_context, task_context_str, corpus_name, lexicon_max_prefix_length, lexicon_max_suffix_length, lexicon_min_char_ngram_length, lexicon_max_char_ngram_length, lexicon_char_ngram_include_terminators, lexicon_char_ngram_mark_boundaries, name)
    442       lexicon_char_ngram_include_terminators=lexicon_char_ngram_include_terminators,
    443       lexicon_char_ngram_mark_boundaries=lexicon_char_ngram_mark_boundaries,
--> 444       name=name)
    445   return result
    446

/home/andrei/models/venv/local/lib/python2.7/site-packages/tensorflow/python/framework/op_def_library.pyc in apply_op(self, op_type_name, name, **keywords)
    326     """
    327     output_structure, is_stateful, op = self._apply_op_helper(
--> 328         op_type_name, name, **keywords)
    329     if output_structure:
    330       outputs = op.outputs

/home/andrei/models/venv/local/lib/python2.7/site-packages/tensorflow/python/framework/op_def_library.pyc in _apply_op_helper(self, op_type_name, name, **keywords)
    785       op = g.create_op(op_type_name, inputs, output_types, name=scope,
    786                        input_types=input_types, attrs=attr_protos,
--> 787                        op_def=op_def)
    788     return output_structure, op_def.is_stateful, op
    789

/home/andrei/models/venv/local/lib/python2.7/site-packages/tensorflow/python/framework/ops.pyc in create_op(self, op_type, inputs, dtypes, input_types, name, attrs, op_def, compute_shapes, compute_device)
   3390           input_types=input_types,
   3391           original_op=self._default_original_op,
-> 3392           op_def=op_def)
   3393
   3394     # Note: shapes are lazily computed with the C API enabled.

/home/andrei/models/venv/local/lib/python2.7/site-packages/tensorflow/python/framework/ops.pyc in __init__(self, node_def, g, inputs, output_types, control_inputs, input_types, original_op, op_def)
   1732           op_def, inputs, node_def.attr)
   1733       self._c_op = _create_c_op(self._graph, node_def, grouped_inputs,
-> 1734                                 control_input_ops)
   1735     else:
   1736       self._c_op = None

/home/andrei/models/venv/local/lib/python2.7/site-packages/tensorflow/python/framework/ops.pyc in _create_c_op(graph, node_def, inputs, control_inputs)
   1568   except errors.InvalidArgumentError as e:
   1569     # Convert to ValueError for backwards compatibility.
-> 1570     raise ValueError(str(e))
   1571
   1572   return c_op

ValueError: Op type not registered 'LexiconBuilder' in binary running on andrei-K501LB. Make sure the Op and Kernel are registered in the binary running in this process. while building NodeDef 'LexiconBuilder'
sudoandros commented 6 years ago

I have built syntaxnet and dragnn from source, following the instructions in the manual installation section. The behavior is exactly the same. I have no idea what else I can do to avoid this problem.

paulhager commented 5 years ago

Did you ever find a solution to this problem, or did you give up? I'm also running into it while trying to set up wrapper scripts for the original syntaxnet (as in https://github.com/spoddutur/syntaxnet and https://github.com/IINemo/syntaxnet_api_server). From my research, it might be a mismatch between the syntaxnet version used to compile the graph and the one used to execute it: perhaps the FeatureSize op is no longer registered because it was only used in a now-antiquated version?

sudoandros commented 5 years ago

@paulhager I'm not sure what happened, but I deleted everything and did a clean binary install. As far as I remember, the problem could have been caused by a conflict between syntaxnet built from source and syntaxnet installed from the binary package. Check that you have only one version of syntaxnet on your computer.
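One quick way to check for the source-vs-binary conflict described above is to list every location on sys.path where a package of the given name resolves. This helper is my own diagnostic sketch, not part of syntaxnet or the thread; the function name find_package_copies is made up for illustration.

```python
# Diagnostic sketch for the "two copies of syntaxnet" situation:
# scan sys.path for every directory (or module file) matching a
# package name. More than one hit suggests a conflicting install.
import os
import sys


def find_package_copies(pkg_name):
    """Return every sys.path entry that holds a package of this name."""
    hits = []
    for entry in sys.path:
        candidate = os.path.join(entry, pkg_name)
        if os.path.isdir(candidate) or os.path.isfile(candidate + ".py"):
            hits.append(candidate)
    return hits


copies = find_package_copies("syntaxnet")
if len(copies) > 1:
    print("Multiple copies found; likely source/binary conflict:")
    for c in copies:
        print("  ", c)
elif copies:
    print("Single installation:", copies[0])
else:
    print("syntaxnet is not importable from this interpreter")
```

Running `pip uninstall syntaxnet` repeatedly until pip reports nothing left, then reinstalling once, is the usual way to get back to a single copy.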