mohit-av opened 1 year ago
Any update?
Hi, @mohit-av
Apologies for the delayed response. I tried to replicate the issue on my end and I'm getting the same error message; for your reference I've added a gist file and the error log output below, so we'll have to dig deeper into this issue and will update you soon.
Thank you for bringing this issue to our attention; I really appreciate your valuable time and effort. Thank you!
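For reference, the reproduction goes through the converter's tf_saved_model path. A minimal sketch, assuming the convert_tf_saved_model(saved_model_dir, output_dir) entry point shown in the traceback below; the paths are placeholders, and the actual repro lives in the gist:

```python
# Reproduction sketch (placeholder paths; the real repro is in the linked gist).
# This is the same code path the tensorflowjs_converter CLI takes for
# --input_format=tf_saved_model, as the traceback below shows.
from tensorflowjs.converters import tf_saved_model_conversion_v2

tf_saved_model_conversion_v2.convert_tf_saved_model(
    "/content/tfc_decoder",   # placeholder: TFC decoder SavedModel directory
    "/content/tfjs_model",    # placeholder: output directory for the TF.js model
)
# Fails while loading the SavedModel with:
#   RuntimeError: Op type not registered 'CreateRangeDecoder' ...
```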
2023-10-03 14:28:10.057660: E tensorflow/compiler/xla/stream_executor/cuda/cuda_dnn.cc:9342] Unable to register cuDNN factory: Attempting to register factory for plugin cuDNN when one has already been registered
2023-10-03 14:28:10.057730: E tensorflow/compiler/xla/stream_executor/cuda/cuda_fft.cc:609] Unable to register cuFFT factory: Attempting to register factory for plugin cuFFT when one has already been registered
2023-10-03 14:28:10.057784: E tensorflow/compiler/xla/stream_executor/cuda/cuda_blas.cc:1518] Unable to register cuBLAS factory: Attempting to register factory for plugin cuBLAS when one has already been registered
2023-10-03 14:28:11.134434: W tensorflow/compiler/tf2tensorrt/utils/py_utils.cc:38] TF-TRT Warning: Could not find TensorRT
2023-10-03 14:28:14.745358: W tensorflow/core/common_runtime/gpu/gpu_bfc_allocator.cc:47] Overriding orig_value setting because the TF_FORCE_GPU_ALLOW_GROWTH environment variable is set. Original config value was 0.
Traceback (most recent call last):
File "/usr/local/lib/python3.10/dist-packages/tensorflow/python/framework/ops.py", line 3022, in op_def_for_type
return self._op_def_cache[type]
KeyError: 'CreateRangeDecoder'
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/local/bin/tensorflowjs_converter", line 8, in <module>
sys.exit(pip_main())
File "/usr/local/lib/python3.10/dist-packages/tensorflowjs/converters/converter.py", line 958, in pip_main
main([' '.join(sys.argv[1:])])
File "/usr/local/lib/python3.10/dist-packages/tensorflowjs/converters/converter.py", line 962, in main
convert(argv[0].split(' '))
File "/usr/local/lib/python3.10/dist-packages/tensorflowjs/converters/converter.py", line 948, in convert
_dispatch_converter(input_format, output_format, args, quantization_dtype_map,
File "/usr/local/lib/python3.10/dist-packages/tensorflowjs/converters/converter.py", line 654, in _dispatch_converter
tf_saved_model_conversion_v2.convert_tf_saved_model(
File "/usr/local/lib/python3.10/dist-packages/tensorflowjs/converters/tf_saved_model_conversion_v2.py", line 982, in convert_tf_saved_model
_convert_tf_saved_model(output_dir, saved_model_dir=saved_model_dir,
File "/usr/local/lib/python3.10/dist-packages/tensorflowjs/converters/tf_saved_model_conversion_v2.py", line 759, in _convert_tf_saved_model
model = _load_model(saved_model_dir, saved_model_tags_list)
File "/usr/local/lib/python3.10/dist-packages/tensorflowjs/converters/tf_saved_model_conversion_v2.py", line 583, in _load_model
model = load(saved_model_dir, saved_model_tags)
File "/usr/local/lib/python3.10/dist-packages/tensorflow/python/saved_model/load.py", line 900, in load
result = load_partial(export_dir, None, tags, options)["root"]
File "/usr/local/lib/python3.10/dist-packages/tensorflow/python/saved_model/load.py", line 1031, in load_partial
loader = Loader(object_graph_proto, saved_model_proto, export_dir,
File "/usr/local/lib/python3.10/dist-packages/tensorflow/python/saved_model/load.py", line 161, in __init__
function_deserialization.load_function_def_library(
File "/usr/local/lib/python3.10/dist-packages/tensorflow/python/saved_model/function_deserialization.py", line 456, in load_function_def_library
func_graph = function_def_lib.function_def_to_graph(
File "/usr/local/lib/python3.10/dist-packages/tensorflow/python/framework/function_def_to_graph.py", line 91, in function_def_to_graph
graph_def, nested_to_flat_tensor_name = function_def_to_graph_def(
File "/usr/local/lib/python3.10/dist-packages/tensorflow/python/framework/function_def_to_graph.py", line 330, in function_def_to_graph_def
op_def = default_graph.op_def_for_type(node_def.op) # pylint: disable=protected-access
File "/usr/local/lib/python3.10/dist-packages/tensorflow/python/framework/ops.py", line 3025, in op_def_for_type
self._op_def_for_type(type)
RuntimeError: Op type not registered 'CreateRangeDecoder' in binary running on 9d80d41b0845. Make sure the Op and Kernel are registered in the binary running in this process. Note that if you are loading a saved graph which used ops from tf.contrib (e.g. `tf.contrib.resampler`), accessing should be done before importing the graph, as contrib ops are lazily registered when the module is first accessed.
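The note at the end of the error suggests registering custom ops before the graph is imported. For TFC that would mean importing tensorflow_compression in the same Python process before the SavedModel is loaded, for example via the converter's Python API rather than the CLI. A sketch under that assumption; note it only addresses the Python-side loading step, since the TF.js runtime has no kernels for CreateRangeDecoder and the other range coding ops, so the converted model would still not run in the browser without op support there:

```python
# Workaround sketch for the Python-side loading error only (assumption: importing
# tensorflow_compression registers its custom range coding ops, including
# CreateRangeDecoder, with the local TensorFlow runtime).
import tensorflow_compression as tfc  # noqa: F401  (imported for op registration)
from tensorflowjs.converters import tf_saved_model_conversion_v2

tf_saved_model_conversion_v2.convert_tf_saved_model(
    "/content/tfc_decoder",   # placeholder SavedModel directory
    "/content/tfjs_model",    # placeholder output directory
)
# Even if loading now succeeds, TF.js itself does not implement these ops, so
# running the converted graph in the browser still requires op support in TF.js.
```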
I'm using TensorFlow Compression (TFC) to build a model. I use the TFC analysis (encoder) model to encode an image into a compressed_image/string, which I plan to send from the backend to the browser. In the browser I then want to decode it back to *.png using the TFC entropy and synthesis (decoder) models. So now I want to convert my decompressor model to tfjs format so I can use it in the browser.
But when converting the decompressor to tfjs, I run into an ops issue: TFC uses some custom ops that break the conversion. Here is the list. I'm looking forward to support for these ops; currently I'm mainly interested in the ops related to the decoder. Any help with supporting these ops, or guidance on how I could do it myself, would be appreciated. Here is the link to the Colab notebook where I was trying to convert the tf_model to tfjs. The error I'm running into is the RuntimeError shown in the traceback above (Op type not registered 'CreateRangeDecoder').
System information: 4.2.0
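For context, the pipeline described in the issue splits into a server-side compress step and a browser-side decompress step. A minimal sketch using TFC along the lines of its official examples; the transforms, prior, filter count, and shapes below are hypothetical placeholders, not the actual model from the issue:

```python
# Sketch of the compress/decompress split described above (hypothetical model).
import tensorflow as tf
import tensorflow_compression as tfc

num_filters = 128  # placeholder

# Hypothetical analysis (encoder) and synthesis (decoder) transforms.
analysis = tf.keras.Sequential([
    tfc.SignalConv2D(num_filters, (5, 5), corr=True, strides_down=2,
                     padding="same_zeros", activation=tfc.GDN()),
])
synthesis = tf.keras.Sequential([
    tfc.SignalConv2D(3, (5, 5), corr=False, strides_up=2,
                     padding="same_zeros", activation=None),
])

# Entropy model whose compress()/decompress() use TFC's range coding custom ops
# (CreateRangeEncoder / CreateRangeDecoder), i.e. the ops missing in TF.js.
prior = tfc.NoisyDeepFactorized(batch_shape=(num_filters,))
entropy_model = tfc.ContinuousBatchedEntropyModel(
    prior, coding_rank=3, compression=True)

# Backend: image -> latent -> compressed string (sent to the browser).
image = tf.random.uniform((1, 256, 256, 3))  # placeholder input
y = analysis(image)
string = entropy_model.compress(y)

# Browser side (what would need to run in TF.js): string -> latent -> image.
y_hat = entropy_model.decompress(string, tf.shape(y)[1:-1])
reconstruction = synthesis(y_hat)
```

The decompress and synthesis steps at the end are the part the issue asks to convert to tfjs, which is why the decoder-related ops are the ones that matter here.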