I'm having issues loading a model produced by GCP Vertex AI:
import tensorflow as tf
tf.saved_model.load(MY_MODEL_PATH)
I'm receiving this error:
Traceback (most recent call last):
File "/home/walter/yin-model/py38/lib/python3.8/site-packages/tensorflow/python/framework/ops.py", line 4177, in _get_op_def
return self._op_def_cache[type]
KeyError: 'DecodeProtoSparseV4'
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "test.py", line 6, in <module>
imported = tf.saved_model.load(NORMAL)
File "/home/walter/yin-model/py38/lib/python3.8/site-packages/tensorflow/python/saved_model/load.py", line 936, in load
result = load_internal(export_dir, tags, options)["root"]
File "/home/walter/yin-model/py38/lib/python3.8/site-packages/tensorflow/python/saved_model/load.py", line 994, in load_internal
root = load_v1_in_v2.load(export_dir, tags)
File "/home/walter/yin-model/py38/lib/python3.8/site-packages/tensorflow/python/saved_model/load_v1_in_v2.py", line 282, in load
result = loader.load(tags=tags)
File "/home/walter/yin-model/py38/lib/python3.8/site-packages/tensorflow/python/saved_model/load_v1_in_v2.py", line 209, in load
functions = function_deserialization.load_function_def_library(
File "/home/walter/yin-model/py38/lib/python3.8/site-packages/tensorflow/python/saved_model/function_deserialization.py", line 406, in load_function_def_library
func_graph = function_def_lib.function_def_to_graph(
File "/home/walter/yin-model/py38/lib/python3.8/site-packages/tensorflow/python/framework/function_def_to_graph.py", line 70, in function_def_to_graph
graph_def, nested_to_flat_tensor_name = function_def_to_graph_def(
File "/home/walter/yin-model/py38/lib/python3.8/site-packages/tensorflow/python/framework/function_def_to_graph.py", line 239, in function_def_to_graph_def
op_def = default_graph._get_op_def(node_def.op) # pylint: disable=protected-access
File "/home/walter/yin-model/py38/lib/python3.8/site-packages/tensorflow/python/framework/ops.py", line 4181, in _get_op_def
pywrap_tf_session.TF_GraphGetOpDef(self._c_graph, compat.as_bytes(type),
tensorflow.python.framework.errors_impl.NotFoundError: Op type not registered 'DecodeProtoSparseV4' in binary running on rocky. Make sure the Op and Kernel are registered in the binary running in this process. Note that if you are loading a saved graph which used ops from tf.contrib, accessing (e.g.) `tf.contrib.resampler` should be done before importing the graph, as contrib ops are lazily registered when the module is first accessed.
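Based on the note at the end of the traceback, I assume the missing op has to be registered in the Python process before the SavedModel is imported. This is roughly what I expect the workaround to look like; the package name and the .so path below are guesses on my part, not something I've confirmed:

import tensorflow as tf

# 'DecodeProtoSparseV4' is not part of core TensorFlow, so it presumably comes
# from a custom op library. Both options below are assumptions on my part.

# Option 1: import a Python package that registers the op on import
# (struct2tensor ships DecodeProtoSparse* ops, but I haven't verified that
# importing it is enough to register them).
# import struct2tensor

# Option 2: load the shared object that implements the op directly.
# The path is a placeholder -- I don't know where this library would live.
# tf.load_op_library("/path/to/_decode_proto_sparse_op.so")

imported = tf.saved_model.load(MY_MODEL_PATH)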
And my running environment:
To match the environment that Google used to create the model:
This issue suggests that Google knows of a fix, which they've added to their Docker containers.
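To narrow down what that fix is, my plan is to check whether the op is registered in my binary versus whatever runs inside Google's prediction container. This is the minimal check I've been using (op_def_registry is an internal TensorFlow module, so this is just what happens to work on my TF 2.x install, not a public API):

from tensorflow.python.framework import op_def_registry

# Returns the OpDef if the op is registered in this process, otherwise None.
# On my machine this prints None, which matches the NotFoundError above.
print(op_def_registry.get("DecodeProtoSparseV4"))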