ndhuy13 opened this issue 3 years ago
@ndhuy13 I haven't tried this but according to the error message, returning a dict may cause the issue.
@usimarit Thank you for your comment. I think the reason comes from https://github.com/TensorSpeech/TensorFlowASR/blob/main/tensorflow_asr/models/base_model.py#L24, because the error I get is:
ValueError: Object dictionary contained a non-trackable object: dict_values([]) (for key metrics)
Are there any other ways to save the model to SavedModel format? Thank you very much. :)
@ndhuy13 Can you comment out that __init__ function of BaseModel and all the functions that use self._metrics, then rebuild, load weights into the model (do not compile), and then convert to SavedModel format?
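For reference, a minimal sketch of that sequence (rebuild, load weights without compiling, export). It assumes a generic Keras-style model; build_conformer below is a hypothetical placeholder for however the Conformer is actually constructed in TensorFlowASR, and the feature shape mirrors the input spec seen later in this thread.

import tensorflow as tf

# Rebuild the architecture exactly as during training. `build_conformer` is a
# hypothetical helper standing in for the real TensorFlowASR construction code.
model = build_conformer("config.yml")

# Load weights only; skipping model.compile() means the metrics dict is never populated.
model.load_weights("latest.h5")

# Trace inference with a fixed input signature and export it as the serving signature.
@tf.function(input_signature=[tf.TensorSpec([None, None, 80, 1], tf.float32)])
def serve(features):
    return model(features, training=False)

tf.saved_model.save(model, "conformer_saved_model/", signatures={"serving_default": serve})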
Sorry, pressed the wrong close button 😆
Unfortunately, I got this error after I commented out all the functions that use self._metrics and rebuilt:
---------------------------------------------------------------------------
AttributeError Traceback (most recent call last)
<ipython-input-27-169e206cad01> in <module>
----> 1 tf.saved_model.save(module, model_sv_path, signatures={ "serving_default": module.pred})
~/.conda/envs/tf_asr/lib/python3.6/site-packages/tensorflow/python/saved_model/save.py in save(obj, export_dir, signatures, options)
974
975 _, exported_graph, object_saver, asset_info = _build_meta_graph(
--> 976 obj, export_dir, signatures, options, meta_graph_def)
977 saved_model.saved_model_schema_version = constants.SAVED_MODEL_SCHEMA_VERSION
978
~/.conda/envs/tf_asr/lib/python3.6/site-packages/tensorflow/python/saved_model/save.py in _build_meta_graph(obj, export_dir, signatures, options, meta_graph_def)
1074
1075 object_graph_proto = _serialize_object_graph(saveable_view,
-> 1076 asset_info.asset_index)
1077 meta_graph_def.object_graph_def.CopyFrom(object_graph_proto)
1078
~/.conda/envs/tf_asr/lib/python3.6/site-packages/tensorflow/python/saved_model/save.py in _serialize_object_graph(saveable_view, asset_file_def_index)
719 for obj, obj_proto in zip(saveable_view.nodes, proto.nodes):
720 _write_object_proto(obj, obj_proto, asset_file_def_index,
--> 721 saveable_view.function_name_map)
722 return proto
723
~/.conda/envs/tf_asr/lib/python3.6/site-packages/tensorflow/python/saved_model/save.py in _write_object_proto(obj, proto, asset_file_def_index, function_name_map)
759 version=versions_pb2.VersionDef(
760 producer=1, min_consumer=1, bad_consumers=[]),
--> 761 metadata=obj._tracking_metadata)
762 # pylint:enable=protected-access
763 proto.user_object.CopyFrom(registered_type_proto)
~/.conda/envs/tf_asr/lib/python3.6/site-packages/tensorflow/python/keras/engine/base_layer.py in _tracking_metadata(self)
3009 @property
3010 def _tracking_metadata(self):
-> 3011 return self._trackable_saved_model_saver.tracking_metadata
3012
3013 def _list_extra_dependencies_for_serialization(self, serialization_cache):
~/.conda/envs/tf_asr/lib/python3.6/site-packages/tensorflow/python/keras/saving/saved_model/base_serialization.py in tracking_metadata(self)
52 # TODO(kathywu): check that serialized JSON can be loaded (e.g., if an
53 # object is in the python property)
---> 54 return json_utils.Encoder().encode(self.python_properties)
55
56 def list_extra_dependencies_for_serialization(self, serialization_cache):
~/.conda/envs/tf_asr/lib/python3.6/site-packages/tensorflow/python/keras/saving/saved_model/layer_serialization.py in python_properties(self)
39 def python_properties(self):
40 # TODO(kathywu): Add python property validator
---> 41 return self._python_properties_internal()
42
43 def _python_properties_internal(self):
~/.conda/envs/tf_asr/lib/python3.6/site-packages/tensorflow/python/keras/saving/saved_model/layer_serialization.py in _python_properties_internal(self)
57 )
58
---> 59 metadata.update(get_config(self.obj))
60 if self.obj.input_spec is not None:
61 # Layer's input_spec has already been type-checked in the property setter.
~/.conda/envs/tf_asr/lib/python3.6/site-packages/tensorflow/python/keras/saving/saved_model/layer_serialization.py in get_config(obj)
116 # When loading, the program will attempt to revive the object from config,
117 # and if that fails, the object will be revived from the SavedModel.
--> 118 config = generic_utils.serialize_keras_object(obj)['config']
119
120 if config is not None:
~/.conda/envs/tf_asr/lib/python3.6/site-packages/tensorflow/python/keras/utils/generic_utils.py in serialize_keras_object(instance)
250 raise e
251 serialization_config = {}
--> 252 for key, item in config.items():
253 if isinstance(item, six.string_types):
254 serialization_config[key] = item
AttributeError: 'NoneType' object has no attribute 'items'
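The last frame shows serialize_keras_object calling .items() on whatever get_config() returned, so the immediate trigger is get_config() returning None. One hedged workaround sketch, if the model stays a plain subclassed Keras model, is to override get_config() so it returns a dict; the class and fields below are illustrative, not the actual TensorFlowASR BaseModel.

import tensorflow as tf

class ConformerModule(tf.keras.Model):  # illustrative stand-in, not the real BaseModel
    def __init__(self, vocab_size=1000, **kwargs):
        super().__init__(**kwargs)
        self.vocab_size = vocab_size

    def get_config(self):
        # serialize_keras_object() iterates over config.items(), so this must
        # return a dict; returning None produces the AttributeError above.
        return {"name": self.name, "vocab_size": self.vocab_size}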
@ndhuy13 Sorry for the late reply. I've updated the code in the main branch and tested conversion to saved_model in TensorFlow 2.6 using the script, can you try again?
@usimarit thanks for your effort! I tried to export the Conformer Transducer to SavedModel format with your recently uploaded script from master. I used the config.yml from https://drive.google.com/drive/folders/1VAihgSB5vGXwIVTl3hkUk95joxY1YbfW, replaced the key value from null to conformer.subwords, and then ran:
python TensorFlowASR/examples/conformer/saved_model.py --subwords --h5 latest.h5 --config config.yml --output conformer_transducer
The model was exported successfully, but I was not able to load it back into TF. After model = tf.saved_model.load("conformer_transducer/") it fails with the error:
ValueError: indices.shape[-1] must be <= params.rank, but saw indices shape: [?,?,1] and params shape: [] for '{{node conformer/conformer_prediction/conformer_prediction_embedding/GatherNd}} = ResourceGatherNd[Tindices=DT_INT32, _output_shapes=[[?,?,320]], dtype=DT_FLOAT](conformer_conformer_prediction_conformer_prediction_embedding_gathernd_resource:0, conformer/conformer_prediction/conformer_prediction_embedding/ExpandDims:0)' with input shapes: [], [?,?,1].
Could you please help me solve this problem?
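Not a fix, but one way to narrow the GatherNd error down is to inspect what the exported signature expects and which variables were actually captured. The path and signature name below follow the earlier example in this thread; whether the loaded object exposes .variables depends on what was saved.

import tensorflow as tf

loaded = tf.saved_model.load("conformer_transducer/")

# Exported signatures and their expected input specs.
print(list(loaded.signatures.keys()))
infer = loaded.signatures["serving_default"]
print(infer.structured_input_signature)

# Captured variables; an empty embedding table here would match the
# "params shape: []" reported by the GatherNd node.
for v in loaded.variables:
    print(v.name, v.shape)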
I also get ValueError: Object dictionary contained a non-trackable object: dict_values([]) (for key metrics) and have tried with TensorFlow branches master, r2.7 down to r2.4.
@ndhuy13 @pavel-esir @hamlatzis Please try version v1.0.2 to convert Conformer to a saved model and run the saved model for audio inference. Examples: gen_saved_model.py and run_saved_model.py
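A hedged sketch of the run side (loading the exported model and feeding it a wav); the real run_saved_model.py may preprocess audio and name its inputs differently.

import tensorflow as tf

# Load the exported model and grab the inference signature.
loaded = tf.saved_model.load("conformer_saved_model/")
infer = loaded.signatures["serving_default"]
print(infer.structured_input_signature)  # shows the expected input name(s) and shapes

# Decode a 16 kHz mono wav into a float32 waveform.
audio = tf.io.read_file("test.wav")
waveform, sample_rate = tf.audio.decode_wav(audio, desired_channels=1)
waveform = tf.squeeze(waveform, axis=-1)

# "signal" is only a guess at the input name; use whatever the printed
# signature reports, and apply the same feature extraction used at export time.
outputs = infer(signal=tf.expand_dims(waveform, 0))
print(outputs)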
@usimarit just tried v1.0.2 and got errors like:
TypeError: You are passing KerasTensor(type_spec=TensorSpec(shape=(None, None, 80, 1), dtype=tf.float32, name='input_1'), name='input_1', description="created by layer 'input_1'"), an intermediate Keras symbolic input/output, to a TF API that does not allow registering custom dispatchers, such as tf.cond, tf.function, gradient tapes, or tf.map_fn. Keras Functional model construction only supports TF API calls that *do* support dispatching, such as tf.math.add or tf.reshape. Other APIs cannot be called directly on symbolic Keras inputs/outputs. You can work around this limitation by putting the operation in a custom Keras layer call and calling that layer on this symbolic input/output.
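The workaround the message itself points at is wrapping the offending op in a custom layer's call(); a minimal sketch of that pattern, using tf.map_fn as the example op:

import tensorflow as tf

class MapFnLayer(tf.keras.layers.Layer):
    """Runs tf.map_fn inside call(), which is allowed on symbolic inputs,
    instead of applying tf.map_fn directly to a KerasTensor."""

    def call(self, inputs):
        return tf.map_fn(lambda x: tf.reduce_mean(x, axis=-1), inputs)

inputs = tf.keras.Input(shape=(None, 80, 1))
outputs = MapFnLayer()(inputs)  # fine: the op is inside a Layer
model = tf.keras.Model(inputs, outputs)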
Also, something I've noticed from gen_saved_model.py: you save a single signature, only for inference. What if I want to pass a second concrete function for training?
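On the second concrete function: tf.saved_model.save accepts multiple named signatures, so a training step can be exported next to inference. A toy sketch under that assumption (the real Conformer loss, inputs, and wiring would differ):

import tensorflow as tf

model = tf.keras.Sequential([tf.keras.layers.Dense(10)])
model.build((None, 5))
optimizer = tf.keras.optimizers.Adam()

@tf.function(input_signature=[tf.TensorSpec([None, 5], tf.float32)])
def serve(x):
    return {"logits": model(x, training=False)}

@tf.function(input_signature=[tf.TensorSpec([None, 5], tf.float32),
                              tf.TensorSpec([None, 10], tf.float32)])
def train_step(x, y):
    with tf.GradientTape() as tape:
        loss = tf.reduce_mean(tf.square(model(x, training=True) - y))
    grads = tape.gradient(loss, model.trainable_variables)
    optimizer.apply_gradients(zip(grads, model.trainable_variables))
    return {"loss": loss}

# Track model and optimizer on one object so their variables are saved,
# then export both functions as named signatures.
module = tf.Module()
module.model = model
module.optimizer = optimizer
tf.saved_model.save(module, "exported/",
                    signatures={"serving_default": serve, "train": train_step})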
@hamlatzis Where did you encounter that error?
I haven't done training in saved_model yet, but for training you will have to save the model including the optimizer, plus a strategy to store checkpoints. I assume you want to do this so the model can keep learning while making predictions in production, is that right?
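For the "strategy to store checkpoints" part, the usual pattern is a tf.train.Checkpoint that tracks both model and optimizer, managed by a tf.train.CheckpointManager; a small sketch:

import tensorflow as tf

model = tf.keras.Sequential([tf.keras.layers.Dense(10)])
optimizer = tf.keras.optimizers.Adam()

# Track model and optimizer state together so training can resume exactly.
ckpt = tf.train.Checkpoint(model=model, optimizer=optimizer)
manager = tf.train.CheckpointManager(ckpt, directory="ckpts/", max_to_keep=3)

# Restore the latest checkpoint if one exists; no-op when starting fresh.
ckpt.restore(manager.latest_checkpoint)

# ...run training steps, then periodically:
manager.save()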
Hi,
Was this issue fixed? I'm also having difficulty converting the Conformer model to SavedModel format. Can anyone provide step-by-step instructions on how to do this? I was able to run the pretrained Conformer model but need to convert it to SavedModel format.
Hello, I have received your email and will reply to you as soon as possible. Best regards!
Hello everyone! After training, I save the conformer model to SavedModel format with the code below:
I got a ValueError:
Has anyone saved to SavedModel successfully? Can you guide me? Thank you.