tensorflow / models

Models and examples built with TensorFlow

Saving (& Loading) models #8862

Closed cattmi closed 4 years ago

cattmi commented 4 years ago

TF_2_2_colab_object_detection_20200713b_TF2_2_cut_down.zip

Prerequisites

Please answer the following questions for yourself before submitting an issue.

1. The entire URL of the file you are using

https://github.com/tensorflow/models/blob/master/research/object_detection/colab_tutorials/eager_few_shot_od_training_tf2_colab.ipynb

Please see the attached zip file, which contains a Jupyter notebook based on 'eager_few_shot_od_training_tf2_colab.ipynb' authored by 'Tombstone' at Google. That notebook is inaccessible at the time of posting, but is listed under: https://github.com/tensorflow/models/tree/master/research/object_detection/colab_tutorials

Differences:

Migrated to a Jupyter notebook in a newly created TensorFlow 2.2 environment

Apologies: this should be filed under 'research' models

2. Describe the bug

A clear and concise description of what the bug is: the model trains successfully, but both of the following save attempts fail:

1) model.save, in the form detection_model.save(model_dest)

2) tf.saved_model.save(), in the form tf.saved_model.save(detection_model, model_dest, signatures=None, options=None)

3. Steps to reproduce

Steps to reproduce the behavior.

# Save the trained model with model.save()
import os

model_directory = '/home/michael/jupyter_notebooks_TF_2_2/models/'
model_name = 'TF_2_2_colab_DOT_object_detection_20200713a'
model_dest = os.path.join(os.sep, model_directory, model_name)
detection_model.save(model_dest)

yields output:

AttributeError                            Traceback (most recent call last)
<ipython-input> in <module>
      9 #)
     10
---> 11 detection_model.save(model_dest)

AttributeError: 'SSDMetaArch' object has no attribute 'save'

while saving with tf.saved_model.save():

# Save the trained model with tf.saved_model.save()
model_directory = '/home/michael/jupyter_notebooks_TF_2_2/models/'
model_name = 'TF_2_2_colab_DOT_object_detection_20200713a'
model_dest = os.path.join(os.sep, model_directory, model_name)

# tf.saved_model.save(to_export, '/tmp/adder')

tf.saved_model.save(
    detection_model, model_dest,
    signatures=None, options=None
)

yields:

WARNING:tensorflow:Skipping full serialization of Keras layer <object_detection.meta_architectures.ssd_meta_arch.SSDMetaArch object at 0x7f8f40dffee0>, because it is not built.


TypeError                                 Traceback (most recent call last)
<ipython-input> in <module>
      6 # tf.saved_model.save(to_export, '/tmp/adder')
      7
----> 8 tf.saved_model.save(
      9     detection_model, model_dest, signatures=None, options=None
     10 )

~/anaconda3/envs/TF_2_2/lib/python3.8/site-packages/tensorflow/python/saved_model/save.py in save(obj, export_dir, signatures, options)
    948   meta_graph_def = saved_model.meta_graphs.add()
    949
--> 950   _, exported_graph, object_saver, asset_info = _build_meta_graph(
    951       obj, export_dir, signatures, options, meta_graph_def)
    952   saved_model.saved_model_schema_version = constants.SAVED_MODEL_SCHEMA_VERSION

... (intermediate frames through the Keras SavedModel serialization and tf.function tracing machinery omitted) ...

~/anaconda3/envs/TF_2_2/lib/python3.8/site-packages/object_detection/meta_architectures/ssd_meta_arch.py in call(self, inputs, **kwargs)
    250   def call(self, inputs, **kwargs):
--> 251     return self._extract_features(inputs)

~/anaconda3/envs/TF_2_2/lib/python3.8/site-packages/object_detection/models/ssd_resnet_v1_fpn_keras_feature_extractor.py in _extract_features(self, preprocessed_inputs)
    233         (feature_block, feature_block_map[feature_block])
    234         for feature_block in feature_block_list]
--> 235     fpn_features = self._fpn_features_generator(fpn_input_image_features)

~/anaconda3/envs/TF_2_2/lib/python3.8/site-packages/tensorflow/python/keras/engine/base_layer.py in get_losses_for(self, inputs)
   1625     losses = [l for l in self.losses if not l._unconditional_loss]
   1626     inputs = nest.flatten(inputs)
-> 1627     reachable = tf_utils.get_reachable_from_inputs(inputs, losses)
   1628     return [l for l in losses if l in reachable]

~/anaconda3/envs/TF_2_2/lib/python3.8/site-packages/tensorflow/python/keras/utils/tf_utils.py in get_reachable_from_inputs(inputs, targets)
    138       outputs = x.consumers()
    139     else:
--> 140       raise TypeError('Expected Operation, Variable, or Tensor, got ' + str(x))
    141
    142     for y in outputs:

TypeError: Expected Operation, Variable, or Tensor, got block4

4. Expected behavior

A clear and concise description of what you expected to happen: the model should save as described in https://www.tensorflow.org/tutorials/keras/save_and_load and related documentation; the observed behaviour differs.

5. Additional context

Include any logs that would be helpful to diagnose the problem (full traceback above).

6. System information

- OS Platform and Distribution (e.g., Linux Ubuntu 16.04): Ubuntu 18.04.4 LTS
- Mobile device name if the issue happens on a mobile device: n/a
- TensorFlow installed from (source or binary): binary, conda install -c anaconda tensorflow-gpu in a clean environment ("TF_2_2")
- TensorFlow version (use command below): print(tf.version.GIT_VERSION, tf.version.VERSION) reports unknown 2.2.0
- Python version: 3.8.3
- Bazel version (if compiling from source): n/a
- GCC/Compiler version (if compiling from source): n/a
- CUDA/cuDNN version: Cuda compilation tools, release 9.1, V9.1.85 (nvcc: NVIDIA (R) Cuda compiler driver, built Fri_Nov__3_21:07:56_CDT_2017)
- CPU/GPU model and memory: Intel(R) Core(TM) i7-9700 CPU @ 3.00GHz; MemTotal 65886576 kB, MemFree 20100768 kB, MemAvailable 56891796 kB
ravikyram commented 4 years ago

@cattmi

Please fill in the issue template.

I am not able to open the link you shared. Please share a Colab link or a complete code snippet with supporting files so that we can reproduce the issue in our environment; it helps us localize the issue faster. Thanks!

tzekid commented 4 years ago

I have the same problem.

Running on Colab, see notebook below.

To easily reproduce:

  1. open the Training Tutorial Notebook
  2. run all
  3. create a new cell at the end with the following
    detection_model.build((640, 640, 3))  
    tf.keras.models.save_model(detection_model, 'savemodel')

My error message:

---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
<ipython-input-187-08c2245c2768> in <module>()
----> 1 tf.keras.models.save_model(detection_model, 'savedmodel_batch32_1000step')

78 frames
/usr/local/lib/python3.6/dist-packages/tensorflow/python/keras/utils/tf_utils.py in get_reachable_from_inputs(inputs, targets)
    138       outputs = x.consumers()
    139     else:
--> 140       raise TypeError('Expected Operation, Variable, or Tensor, got ' + str(x))
    141 
    142     for y in outputs:

TypeError: Expected Operation, Variable, or Tensor, got block4
ravikyram commented 4 years ago

I have tried in Colab with TF version 2.2 and was able to reproduce the issue. Please find the gist here. Thanks!

pkulzc commented 4 years ago

The detection_model here is a Python instance (SSDMetaArch), not a Keras model instance, so you can't use .save or keras.save_model to save it.
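
A minimal sketch of the checkpoint-based alternative that later comments in this thread converge on (assuming a built detection_model; the directory name here is just an example):

import tensorflow as tf

# Object-based checkpointing works for SSDMetaArch even though Keras-style
# model.save() does not, because SSDMetaArch is a trackable Python object.
ckpt = tf.compat.v2.train.Checkpoint(model=detection_model)
manager = tf.train.CheckpointManager(ckpt, directory='checkpoints/', max_to_keep=3)
saved_path = manager.save()
print('Checkpoint written to', saved_path)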

cattmi commented 4 years ago

Aha..

Many Thanks for promptly resolving ..

Mike


sambhusuryamohan commented 4 years ago

In the previous TensorFlow 1.x Object Detection API there was an option to export the model for TFLite inference after removing the post-processing stage. Is there similar functionality in the new exporter? How can a model be exported only up to a specific node? I see that the saved_model, when loaded using either tf.saved_model.load or keras.load, doesn't have a prune function; is there another script that does this?

ravikyram commented 4 years ago

@cattmi

Please close this thread if your issue has been resolved. Thanks!

cattmi commented 4 years ago

@sambhusuryamohan, we should probably start a separate thread somewhere appropriate on saving/loading and TFLite export of the Python-coded models, as this is an important topic, but I will close this thread as resolved: i.e. we now know that model.save can only work with Keras-implemented models, not Python ones.

abg3 commented 4 years ago

The detection_model here is a Python instance (SSDMetaArch), not a Keras model instance, so you can't use .save or keras.save_model to save it.

Then how do we save the SSDMetaArch instead?

itimmisriis commented 4 years ago

Is there seriously no way to export a detection model to TFLite in TF2?

Dontpanice commented 3 years ago

The detection_model here is a Python instance (SSDMetaArch), not a Keras model instance, so you can't use .save or keras.save_model to save it.

Then how do we save the SSDMetaArch instead?

I would also like to know which steps to take to save the model when we have it as an SSDMetaArch instance. Do I need to convert it into a Keras instance before saving, or is there a way to save it directly as an SSDMetaArch instance?

vermavinay982 commented 3 years ago

You can save your detection model by building it and then using tf.saved_model to save it directly. An SSDMetaArch can be saved this way.

detection_model.build((640, 640, 3))  
tf.saved_model.save(detection_model, 'model_name', signatures=None, options=None)
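
To check what was actually exported, one can inspect the signature map after loading (a sketch, not from the original comment; with signatures=None the map will usually be empty, which is why the inference questions below come up):

loaded = tf.saved_model.load('model_name')
print(list(loaded.signatures.keys()))  # typically [] when saved with signatures=None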
rockstar221123 commented 3 years ago

@vermavinay982 how do I load the saved .pb model for this SSDMetaArch and run inference with it? If I use tf.saved_model.load() and then try to run inference on my dataset, I get the error below.

AttributeError: 'UserObject' object has no attribute 'preprocess'

Nannigalaxy commented 3 years ago

You can save your detection model by building it and then using tf.saved_model to save it directly. An SSDMetaArch can be saved this way.

detection_model.build((640, 640, 3))  
tf.saved_model.save(detection_model, 'model_name', signatures=None, options=None)

It can be saved, but without signature functions we can't use the model to run inference. See models/research/object_detection/exporter_lib_v2.py, line 278.
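
For a full export with inference signatures, the Object Detection exporter can also be driven from Python; roughly like the sketch below, assuming a trained checkpoint directory and the pipeline config used for training (argument names and paths are placeholders and may differ slightly between releases):

import tensorflow as tf
from google.protobuf import text_format
from object_detection import exporter_lib_v2
from object_detection.protos import pipeline_pb2

# Load the training pipeline config into a proto (paths are placeholders).
pipeline_config = pipeline_pb2.TrainEvalPipelineConfig()
with tf.io.gfile.GFile('path/to/pipeline.config', 'r') as f:
    text_format.Merge(f.read(), pipeline_config)

# Export a SavedModel with a serving signature, as exporter_main_v2.py does.
exporter_lib_v2.export_inference_graph(
    'image_tensor',            # input_type
    pipeline_config,           # pipeline proto loaded above
    'path/to/checkpoint_dir',  # directory containing the trained checkpoint
    'path/to/exported_model')  # output SavedModel directory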

vermavinay982 commented 3 years ago

https://github.com/tensorflow/models/blob/master/research/object_detection/colab_tutorials/eager_few_shot_od_training_tflite.ipynb

You can refer to the notebook above. It covers everything from training to saving models; even if you don't need TFLite, checkpoints will still be available in the output. I couldn't find any other support for saving SSDMetaArch models - a ckpt_manager should be used, as it is in the TFLite notebook.

vermavinay982 commented 3 years ago

Since the model is not a Keras model but an SSDMetaArch, you can try saving the checkpoint and the config file. Build the model and load from the checkpoint to restore the weights. I found this solution; it may not be the best, but it works. If you have a better one, please share.

# Save the new pipeline config
import tensorflow as tf
from object_detection.utils import config_util

new_pipeline_proto = config_util.create_pipeline_proto_from_configs(configs)
config_util.save_pipeline_config(new_pipeline_proto, '/content/new_config')

# Save the trained weights as an object-based checkpoint
exported_ckpt = tf.compat.v2.train.Checkpoint(model=detection_model)
ckpt_manager = tf.train.CheckpointManager(
    exported_ckpt, directory="test_data/checkpoint/", max_to_keep=5)
print('Done fine-tuning!')

ckpt_manager.save()
print('Checkpoint saved!')

Loading the model from checkpoint

# Reference for loading the model from the checkpoint
import tensorflow as tf
from object_detection.utils import config_util
from object_detection.builders import model_builder

print('Building model and restoring weights for fine-tuning...', flush=True)
num_classes = 1
pipeline_config = '/content/new_config/pipeline.config'
checkpoint_path = 'test_data/checkpoint/ckpt-1'

configs = config_util.get_configs_from_pipeline_file(pipeline_config)
model_config = configs['model']
x = model_builder.build(model_config=model_config, is_training=True)

# Restore the saved weights into the freshly built model
ckpt = tf.compat.v2.train.Checkpoint(model=x)
ckpt.restore(checkpoint_path).expect_partial()
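
A quick smoke test of the restored model could look like this (a sketch; the 640x640 input size is an assumption matching the SSD configs used in this thread):

import numpy as np

# Run one dummy image through preprocess / predict / postprocess.
dummy = tf.convert_to_tensor(np.zeros((1, 640, 640, 3), dtype=np.float32))
preprocessed, shapes = x.preprocess(dummy)
prediction_dict = x.predict(preprocessed, shapes)
detections = x.postprocess(prediction_dict, shapes)
print(list(detections.keys()))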
wayne931121 commented 2 years ago

Referring to this page, it can be done like this: https://www.tensorflow.org/lite/guide/signatures

@tf.function(input_signature=[tf.TensorSpec(shape=[None, 640, 640, 3], dtype=tf.float32)])
def detect(input_tensor):
    preprocessed_image, shapes = detection_model.preprocess(input_tensor)
    prediction_dict = detection_model.predict(preprocessed_image, shapes)
    return detection_model.postprocess(prediction_dict, shapes)

tf.saved_model.save(
    detection_model, 'your_path_to_save_model',
    signatures={
        'detect': detect.get_concrete_function()
    })

new_model = tf.saved_model.load('your_path_to_save_model')

detections = new_model.signatures['detect'](your_detection_img_tensor)
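
For example, a real image can be fed by decoding, resizing, and batching it as float32 (the file name below is a placeholder):

img = tf.io.decode_jpeg(tf.io.read_file('example.jpg'), channels=3)
img = tf.image.resize(tf.cast(img, tf.float32), (640, 640))[tf.newaxis, ...]
detections = new_model.signatures['detect'](img)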

eager_few_shot_training.zip

vermavinay982 commented 2 years ago

@wayne931121 It worked, the .pb file is generated and it is detecting perfectly. Thanks!

Mukulareddy commented 2 years ago

You can save your detection model by building it and then using tf.saved_model to save it directly. An SSDMetaArch can be saved this way.

detection_model.build((640, 640, 3))  
tf.saved_model.save(detection_model, 'model_name', signatures=None, options=None)

I tried saving the model with tf.saved_model.save() and am facing this issue: ValueError: Weights for model ssd_mobile_net_v1_keras_feature_extractor have not yet been created. Weights are created when the Model is first called on inputs or build() is called with an input_shape.
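
One way around this error, per the message itself, is to run a dummy image through the model so the feature extractor's variables get created before saving (a sketch; the 640x640 shape is an assumption based on the configs used earlier in this thread):

import numpy as np
import tensorflow as tf

# Force variable creation with one dummy forward pass before saving.
dummy = tf.convert_to_tensor(np.zeros((1, 640, 640, 3), dtype=np.float32))
preprocessed, shapes = detection_model.preprocess(dummy)
detection_model.predict(preprocessed, shapes)

tf.saved_model.save(detection_model, 'model_name', signatures=None, options=None)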

mburaksayici commented 1 year ago

Hi all, I'm trying to use the TF Object Detection API without scripts like "exporter_main_tf.py", because I'm building a platform and don't want an os.cmd("python exporter_main_tf.py --dir") kind of codebase. Try wrapping the model in a TensorFlow Module class and saving it with tf.saved_model.save. I've researched this problem for far too long, across all the GitHub issues and Stack Overflow, and this is the only way I've found.

import tensorflow as tf
from object_detection.utils import config_util
from object_detection.builders import model_builder

# Paths to the model configuration, checkpoint, and export directory (placeholders)
pipeline_config = 'path/to/pipeline.config'
checkpoint_path = 'path/to/checkpoint'
output_directory = 'path/to/save/model'

# Load the pipeline configuration
configs = config_util.get_configs_from_pipeline_file(pipeline_config)

# Build the detection model with the standard OD API model builder
model = model_builder.build(model_config=configs['model'], is_training=False)

# Restore the checkpoint
checkpoint = tf.train.Checkpoint(model=model)
checkpoint.restore(tf.train.latest_checkpoint(checkpoint_path))

class DetectionModule(tf.Module):
    def __init__(self, model):
        super(DetectionModule, self).__init__()
        self.model = model

    @tf.function(input_signature=[tf.TensorSpec(shape=[1, None, None, 3], dtype=tf.float32, name='input_image')])
    def predict(self, input_image):
        # input_image = tf.cast(input_image, dtype=tf.uint8)  # Convert input image from float to uint8 if needed
        preprocessed_image, shapes = self.model.preprocess(input_image)
        prediction_dict = self.model.predict(preprocessed_image, shapes)
        detections = self.model.postprocess(prediction_dict, shapes)
        return detections

model_module = DetectionModule(model)

tf.saved_model.save(model_module, export_dir=output_directory)
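
Loading the exported module back and calling the traced function could then look like this (a sketch; image_tensor is a placeholder float32 batch of shape [1, H, W, 3]):

loaded = tf.saved_model.load(output_directory)
detections = loaded.predict(image_tensor)  # the @tf.function traced above is restored as an attribute
print(list(detections.keys()))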
riyaj8888 commented 1 year ago

class ItemModel(tf.keras.Model):
    def __init__(self, base_model="https://tfhub.dev/google/nnlm-en-dim50/2", embedding_dimension=32):
        super().__init__()
        self.base_model = hub.KerasLayer(base_model, input_shape=[], trainable=True)
        self.lin = tf.keras.layers.Dense(embedding_dimension)

    def call(self, inputs):
        # ip = tf.cast(inputs, tf.string)
        features = self.base_model([inputs])
        # print(features)
        embs = self.lin(features)
        return embs

item_model = ItemModel()

I am trying to get embeddings for text strings from a tfds dataset, but tfds returns tf.Tensor objects and I get the following error when I pass sample/all data to the model using data.batch(1).map(item_model):

---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
Cell In[51], line 1
----> 1 print(item_model(items.batch(1)))

File /opt/conda/envs/tf/lib/python3.8/site-packages/keras/utils/traceback_utils.py:70, in filter_traceback.<locals>.error_handler(*args, **kwargs)
     67     filtered_tb = _process_traceback_frames(e.__traceback__)
     68     # To get the full stack trace, call:
     69     # `tf.debugging.disable_traceback_filtering()`
---> 70     raise e.with_traceback(filtered_tb) from None
     71 finally:
     72     del filtered_tb

Cell In[49], line 22, in ItemModel.call(self, inputs)
     20 def call(self, inputs):
     21     # ip = tf.cast(inputs, tf.string)
---> 22     features = self.base_model([inputs])
     23     # print(features)
     24     embs = self.lin(features)

File /opt/conda/envs/tf/lib/python3.8/site-packages/tensorflow_hub/keras_layer.py:234, in KerasLayer.call(self, inputs, training)
    228 # ...but we may also have to pass a Python boolean for training, which
    229 # is the logical "and" of this layer's trainability and what the surrounding
    230 # model is doing (analogous to tf.keras.layers.BatchNormalization in TF2).
    231 # For the latter, we have to look in two places: the training argument,
    232 # or else Keras' global learning_phase, which might actually be a tensor.
    233 if not self._has_training_argument:
--> 234     result = f()
    235 else:
    236     if self.trainable:

ValueError: Exception encountered when calling layer "keras_layer_13" (type KerasLayer).

When input_signature is provided, all inputs to the Python function must be convertible to tensors:
  inputs: ([])
  input_signature: (TensorSpec(shape=(None,), dtype=tf.string, name=None))

Call arguments received by layer "keras_layer_13" (type KerasLayer):
  • inputs=['']
  • training=None