TensorSpeech / TensorFlowASR

:zap: TensorFlowASR: Almost State-of-the-art Automatic Speech Recognition in Tensorflow 2. Supported languages that can use characters or subwords
https://huylenguyen.com/asr
Apache License 2.0

Cannot save Conformer model to Tensorflow SavedModel #209

Open ndhuy13 opened 3 years ago

ndhuy13 commented 3 years ago

Hello everyone! After training, I save the Conformer model to the SavedModel format with the code below:

import tensorflow as tf

# TensorFlowASR imports (module paths as of the 0.x releases; they may differ in newer versions)
from tensorflow_asr.configs.config import Config
from tensorflow_asr.featurizers.speech_featurizers import TFSpeechFeaturizer
from tensorflow_asr.featurizers.text_featurizers import SubwordFeaturizer
from tensorflow_asr.models.conformer import Conformer

config = Config(config_path)
speech_featurizer = TFSpeechFeaturizer(config.speech_config)
text_featurizer = SubwordFeaturizer(config.decoder_config)
conformer = Conformer(**config.model_config, vocabulary_size=text_featurizer.num_classes)
conformer.make(speech_featurizer.shape)
conformer.load_weights(model_path, by_name=True)
conformer.add_featurizers(speech_featurizer, text_featurizer)

class aModule(tf.Module):
    def __init__(self, model):
        self.model = model

    signature_dict = {
        "inputs": tf.TensorSpec(shape=[None, None, 80, 1], dtype=tf.float32, name="inputs"),
        "inputs_length": tf.TensorSpec(shape=[None], dtype=tf.int32, name="inputs_length"),
    }

    @tf.function(input_signature=[signature_dict])
    def pred(self, input_batch):
        result = self.model.recognize(input_batch)
        return {"ASR": result}

module = aModule(conformer)
tf.saved_model.save(module, model_path, signatures={"serving_default": module.pred})

I got ValueError:

---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-39-9ed305698598> in <module>
----> 1 tf.saved_model.save(module, model_path, signatures={ "serving_default": module.pred})

~/.local/lib/python3.8/site-packages/tensorflow/python/saved_model/save.py in save(obj, export_dir, signatures, options)
    973   meta_graph_def = saved_model.meta_graphs.add()
    974 
--> 975   _, exported_graph, object_saver, asset_info = _build_meta_graph(
    976       obj, export_dir, signatures, options, meta_graph_def)
    977   saved_model.saved_model_schema_version = constants.SAVED_MODEL_SCHEMA_VERSION

~/.local/lib/python3.8/site-packages/tensorflow/python/saved_model/save.py in _build_meta_graph(obj, export_dir, signatures, options, meta_graph_def)
   1059   # Note we run this twice since, while constructing the view the first time
   1060   # there can be side effects of creating variables.
-> 1061   _ = _SaveableView(checkpoint_graph_view)
   1062   saveable_view = _SaveableView(checkpoint_graph_view, wrapped_functions)
   1063   object_saver = util.TrackableSaver(checkpoint_graph_view)

~/.local/lib/python3.8/site-packages/tensorflow/python/saved_model/save.py in __init__(self, checkpoint_view, wrapped_functions)
    176     self.checkpoint_view = checkpoint_view
    177     trackable_objects, node_ids, slot_variables = (
--> 178         self.checkpoint_view.objects_ids_and_slot_variables())
    179     self.nodes = trackable_objects
    180     self.node_ids = node_ids

~/.local/lib/python3.8/site-packages/tensorflow/python/training/tracking/graph_view.py in objects_ids_and_slot_variables(self)
    421       A tuple of (trackable objects, object -> node id, slot variables)
    422     """
--> 423     trackable_objects, path_to_root = self._breadth_first_traversal()
    424     object_names = object_identity.ObjectIdentityDictionary()
    425     for obj, path in path_to_root.items():

~/.local/lib/python3.8/site-packages/tensorflow/python/training/tracking/graph_view.py in _breadth_first_traversal(self)
    199             % (current_trackable,))
    200       bfs_sorted.append(current_trackable)
--> 201       for name, dependency in self.list_dependencies(current_trackable):
    202         if dependency not in path_to_root:
    203           path_to_root[dependency] = (

~/.local/lib/python3.8/site-packages/tensorflow/python/saved_model/save.py in list_dependencies(self, obj)
    112 
    113     used_names = set()
--> 114     for name, dep in super(_AugmentedGraphView, self).list_dependencies(obj):
    115       used_names.add(name)
    116       if name in extra_dependencies:

~/.local/lib/python3.8/site-packages/tensorflow/python/training/tracking/graph_view.py in list_dependencies(self, obj)
    159     # pylint: disable=protected-access
    160     obj._maybe_initialize_trackable()
--> 161     return obj._checkpoint_dependencies
    162     # pylint: enable=protected-access
    163 

~/.local/lib/python3.8/site-packages/tensorflow/python/training/tracking/data_structures.py in _checkpoint_dependencies(self)
    508            "non-trackable object; it will be subsequently ignored." % (self,)))
    509     if self._external_modification:
--> 510       raise ValueError(
    511           ("Unable to save the object %s (a list wrapper constructed to track "
    512            "trackable TensorFlow objects). The wrapped list was modified "

ValueError: Unable to save the object ListWrapper(dict_values([])) (a list wrapper constructed to track trackable TensorFlow objects). The wrapped list was modified outside the wrapper (its final value was dict_values([]), its value when a checkpoint dependency was added was None), which breaks restoration on object creation.

If you don't need this list checkpointed, wrap it in a NoDependency object; it will be subsequently ignored.

Has anyone managed to save to SavedModel? Can you guide me? Thank you.

nglehuy commented 3 years ago

@ndhuy13 I haven't tried this but according to the error message, returning a dict may cause the issue.
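
Something like this (untested sketch, reusing the TensorSpec signature from your snippet) is what I mean: return the recognition result directly instead of wrapping it in a dict, and call super().__init__() in the tf.Module subclass:

class aModule(tf.Module):
    def __init__(self, model):
        super().__init__()  # initialize tf.Module tracking / name scope
        self.model = model

    @tf.function(input_signature=[{
        "inputs": tf.TensorSpec(shape=[None, None, 80, 1], dtype=tf.float32, name="inputs"),
        "inputs_length": tf.TensorSpec(shape=[None], dtype=tf.int32, name="inputs_length"),
    }])
    def pred(self, input_batch):
        # Return the tensor itself rather than a dict.
        return self.model.recognize(input_batch)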

ndhuy13 commented 3 years ago

@usimarit Thank you for your comment. I think the reason comes from https://github.com/TensorSpeech/TensorFlowASR/blob/main/tensorflow_asr/models/base_model.py#L24, because the error that I get is

ValueError: Object dictionary contained a non-trackable object: dict_values([]) (for key metrics)

Are there any other ways to save the model to the SavedModel format? Thank you very much. :)

nglehuy commented 3 years ago

@ndhuy13 Can you comment out that __init__ function of BaseModel and all the functions that use self._metrics, then rebuild, load the weights into the model (do not compile), and then convert to the SavedModel format?
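
Alternatively (untested idea, just following the hint at the end of the error message), you could try keeping that metrics dict out of checkpoint tracking by wrapping it in a NoDependency object. Note that NoDependency lives in a private TF module, so the import path may differ between TF versions:

import tensorflow as tf
from tensorflow.python.training.tracking.data_structures import NoDependency

class Example(tf.keras.Model):
    def __init__(self, **kwargs):
        super().__init__(**kwargs)
        # The wrapped dict is stored as a plain attribute and is ignored by
        # checkpoint tracking and tf.saved_model.save.
        self._metrics = NoDependency({})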

nglehuy commented 3 years ago

Sorry, I pressed the wrong close button 😆

ndhuy13 commented 3 years ago

Unfortunately, I got this error after I commented out all the functions that use self._metrics and rebuilt.

---------------------------------------------------------------------------
AttributeError                            Traceback (most recent call last)
<ipython-input-27-169e206cad01> in <module>
----> 1 tf.saved_model.save(module, model_sv_path, signatures={ "serving_default": module.pred})

~/.conda/envs/tf_asr/lib/python3.6/site-packages/tensorflow/python/saved_model/save.py in save(obj, export_dir, signatures, options)
    974 
    975   _, exported_graph, object_saver, asset_info = _build_meta_graph(
--> 976       obj, export_dir, signatures, options, meta_graph_def)
    977   saved_model.saved_model_schema_version = constants.SAVED_MODEL_SCHEMA_VERSION
    978 

~/.conda/envs/tf_asr/lib/python3.6/site-packages/tensorflow/python/saved_model/save.py in _build_meta_graph(obj, export_dir, signatures, options, meta_graph_def)
   1074 
   1075   object_graph_proto = _serialize_object_graph(saveable_view,
-> 1076                                                asset_info.asset_index)
   1077   meta_graph_def.object_graph_def.CopyFrom(object_graph_proto)
   1078 

~/.conda/envs/tf_asr/lib/python3.6/site-packages/tensorflow/python/saved_model/save.py in _serialize_object_graph(saveable_view, asset_file_def_index)
    719   for obj, obj_proto in zip(saveable_view.nodes, proto.nodes):
    720     _write_object_proto(obj, obj_proto, asset_file_def_index,
--> 721                         saveable_view.function_name_map)
    722   return proto
    723 

~/.conda/envs/tf_asr/lib/python3.6/site-packages/tensorflow/python/saved_model/save.py in _write_object_proto(obj, proto, asset_file_def_index, function_name_map)
    759           version=versions_pb2.VersionDef(
    760               producer=1, min_consumer=1, bad_consumers=[]),
--> 761           metadata=obj._tracking_metadata)
    762       # pylint:enable=protected-access
    763     proto.user_object.CopyFrom(registered_type_proto)

~/.conda/envs/tf_asr/lib/python3.6/site-packages/tensorflow/python/keras/engine/base_layer.py in _tracking_metadata(self)
   3009   @property
   3010   def _tracking_metadata(self):
-> 3011     return self._trackable_saved_model_saver.tracking_metadata
   3012 
   3013   def _list_extra_dependencies_for_serialization(self, serialization_cache):

~/.conda/envs/tf_asr/lib/python3.6/site-packages/tensorflow/python/keras/saving/saved_model/base_serialization.py in tracking_metadata(self)
     52     # TODO(kathywu): check that serialized JSON can be loaded (e.g., if an
     53     # object is in the python property)
---> 54     return json_utils.Encoder().encode(self.python_properties)
     55 
     56   def list_extra_dependencies_for_serialization(self, serialization_cache):

~/.conda/envs/tf_asr/lib/python3.6/site-packages/tensorflow/python/keras/saving/saved_model/layer_serialization.py in python_properties(self)
     39   def python_properties(self):
     40     # TODO(kathywu): Add python property validator
---> 41     return self._python_properties_internal()
     42 
     43   def _python_properties_internal(self):

~/.conda/envs/tf_asr/lib/python3.6/site-packages/tensorflow/python/keras/saving/saved_model/layer_serialization.py in _python_properties_internal(self)
     57     )
     58 
---> 59     metadata.update(get_config(self.obj))
     60     if self.obj.input_spec is not None:
     61       # Layer's input_spec has already been type-checked in the property setter.

~/.conda/envs/tf_asr/lib/python3.6/site-packages/tensorflow/python/keras/saving/saved_model/layer_serialization.py in get_config(obj)
    116     # When loading, the program will attempt to revive the object from config,
    117     # and if that fails, the object will be revived from the SavedModel.
--> 118     config = generic_utils.serialize_keras_object(obj)['config']
    119 
    120   if config is not None:

~/.conda/envs/tf_asr/lib/python3.6/site-packages/tensorflow/python/keras/utils/generic_utils.py in serialize_keras_object(instance)
    250       raise e
    251     serialization_config = {}
--> 252     for key, item in config.items():
    253       if isinstance(item, six.string_types):
    254         serialization_config[key] = item

AttributeError: 'NoneType' object has no attribute 'items'

nglehuy commented 3 years ago

@ndhuy13 Sorry for the late reply. I've updated the code in the main branch and tested the conversion to saved_model in TensorFlow 2.6 using the script; can you try again?

pavel-esir commented 3 years ago

@usimarit thanks for your effort! I tried to export the Conformer Transducer to the SavedModel format with your recently uploaded script from master. I used the config.yml from https://drive.google.com/drive/folders/1VAihgSB5vGXwIVTl3hkUk95joxY1YbfW, changing the key value from null to conformer.subwords, and then ran:

python TensorFlowASR/examples/conformer/saved_model.py --subwords --h5 latest.h5 --config config.yml --output conformer_transducer

The model was exported successfully, but I was not able to load it back into TF: after model = tf.saved_model.load("conformer_transducer/") it fails with the error:

ValueError: indices.shape[-1] must be <= params.rank, but saw indices shape: [?,?,1] and params shape: [] for '{{node conformer/conformer_prediction/conformer_prediction_embedding/GatherNd}} = ResourceGatherNd[Tindices=DT_INT32, _output_shapes=[[?,?,320]], dtype=DT_FLOAT](conformer_conformer_prediction_conformer_prediction_embedding_gathernd_resource:0, conformer/conformer_prediction/conformer_prediction_embedding/ExpandDims:0)' with input shapes: [], [?,?,1].

Could you please help me to solve this problem?

hamlatzis commented 2 years ago

I also get ValueError: Object dictionary contained a non-trackable object: dict_values([]) (for key metrics) and have tried TensorFlow branches master and r2.7 down to r2.4.

nglehuy commented 2 years ago

@ndhuy13 @pavel-esir @hamlatzis Please try version v1.0.2 to convert the Conformer to a saved model and run the saved model for audio inference. Examples: gen_saved_model.py and run_saved_model.py
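
For reference, running the exported model boils down to roughly the sketch below. The signature key and the inputs / inputs_length tensor names are assumptions based on the export code earlier in this thread; check the real names with saved_model_cli show --dir <path> --all.

import tensorflow as tf

# Rough sketch of loading and calling the exported model; names are assumptions.
loaded = tf.saved_model.load("path/to/saved_model")
infer = loaded.signatures["serving_default"]

# Dummy batch: 1 utterance, 100 feature frames, 80 log-mel bins, 1 channel.
features = tf.zeros([1, 100, 80, 1], dtype=tf.float32)
lengths = tf.constant([100], dtype=tf.int32)
print(infer(inputs=features, inputs_length=lengths))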

hamlatzis commented 2 years ago

@usimarit I just tried v1.0.2 and get errors like

TypeError: You are passing KerasTensor(type_spec=TensorSpec(shape=(None, None, 80, 1), dtype=tf.float32, name='input_1'), name='input_1', description="created by layer 'input_1'"), an intermediate Keras symbolic input/output, to a TF API that does not allow registering custom dispatchers, such as tf.cond, tf.function, gradient tapes, or tf.map_fn. Keras Functional model construction only supports TF API calls that *do* support dispatching, such as tf.math.add or tf.reshape. Other APIs cannot be called directly on symbolic Keras inputs/outputs. You can work around this limitation by putting the operation in a custom Keras layer call and calling that layer on this symbolic input/output.

Also, I noticed from gen_saved_model.py that you save a single signature, only for inference. What if I want to pass a second concrete function for training?

nglehuy commented 2 years ago

@hamlatzis Where did you encounter that error?

I haven't done training in saved_model yet, but for training you will have to save the model including the optimizer, plus a strategy to store checkpoints. I assume you want to do this so the model can keep learning while making predictions in production, is that right?
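
In plain TF terms (a generic untested sketch, not something the current scripts do), exporting a second concrete function next to the serving one would look roughly like this, since tf.saved_model.save accepts a dict of named signatures:

import tensorflow as tf

class TwoSignatureModule(tf.Module):
    def __init__(self):
        super().__init__()
        self.w = tf.Variable(1.0)
        self.optimizer = tf.keras.optimizers.Adam(1e-3)

    @tf.function(input_signature=[tf.TensorSpec([None], tf.float32)])
    def infer(self, x):
        return {"y": self.w * x}

    @tf.function(input_signature=[tf.TensorSpec([None], tf.float32),
                                  tf.TensorSpec([None], tf.float32)])
    def train(self, x, y):
        with tf.GradientTape() as tape:
            loss = tf.reduce_mean(tf.square(self.w * x - y))
        grads = tape.gradient(loss, [self.w])
        self.optimizer.apply_gradients(zip(grads, [self.w]))
        return {"loss": loss}

module = TwoSignatureModule()
# Run the train function once so the optimizer slot variables exist before export.
module.train(tf.constant([1.0]), tf.constant([2.0]))
tf.saved_model.save(module, "/tmp/two_signature_model",
                    signatures={"serving_default": module.infer,
                                "train": module.train})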

hyunwin commented 2 years ago

Hi,

Was this issue fixed? I'm also having difficulty converting the Conformer model to the SavedModel format. Can anyone provide step-by-step instructions on how to do this? I was able to run the pretrained Conformer model but need to convert it to the SavedModel format.

Aegon007 commented 2 years ago

Hello, I have received your email and will reply as soon as possible. Best regards!