TheCodez opened 2 years ago
Like Shanqing mentioned at https://github.com/tensorflow/tfjs/issues/1808#issuecomment-557099119, it is not possible to convert tfjs_graph_model to keras_saved_model.
If you have a tfjs_graph_model, you should also have the original TensorFlow model (in keras, tf_hub, tf_saved_model, or tf_frozen_model format; see the details here: https://github.com/tensorflow/tfjs/tree/master/tfjs-converter#format-conversion-support-tables) from which the tfjs_graph_model was converted. A tf.GraphModel can only be created by converting a TensorFlow SavedModel with the command-line converter tool and then loading the result via tf.loadGraphModel().
You can convert that original model into whatever format you need for native mobile platforms.
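For the supported direction (original model to TFJS), the command-line converter is used as below; this is a sketch with hypothetical paths, assuming the original model is a Keras HDF5 file:

```shell
# Hypothetical paths; converts an original Keras HDF5 model into a
# TFJS graph model that can be loaded via tf.loadGraphModel().
tensorflowjs_converter \
  --input_format=keras \
  --output_format=tfjs_graph_model \
  path/to/original_model.h5 \
  path/to/web_model
```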
This issue has been automatically marked as stale because it has not had recent activity. It will be closed in 7 days if no further activity occurs. Thank you.
@rthadur Sorry for the late response. In my case it is not a tfjs graph model but a tfjs layers model. I have found another way to do the conversion in the meantime. Maybe a warning could be added that this isn't supported, to avoid confusion in the future?
@TheCodez A layers model should be convertible to a Keras model; can you share the model that causes the error? I am also curious which alternative way you used to convert the model successfully.
Hi, @TheCodez
Could you please refer to the above comment by @pyu10055 and also refer to the official documentation for tfjs-converter? Thank you!
@gaikwadrahul8 @pyu10055 sorry for the late response. The model is: https://github.com/FIGLAB/EyeMU/tree/master/flask2/static/models/tfjsmodel4.
If I remember correctly, there was another model in that repo in a different format, which allowed me to convert the model directly to .onnx without the intermediate Keras step.
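For reference, a direct TensorFlow-to-ONNX conversion of this kind is typically done with tf2onnx; here is a sketch with hypothetical paths, assuming the other model in the repo was a SavedModel (the exact format used above is not recorded in this thread):

```shell
# Hypothetical paths; assumes the input is a TensorFlow SavedModel directory.
pip install tf2onnx
python -m tf2onnx.convert \
  --saved-model path/to/saved_model \
  --output model.onnx
```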
Hi, @TheCodez
I apologize for the delayed response. I have attempted to replicate the reported issue using the latest version of TensorFlow.js, 4.14.0.
I observed the same behavior on my end, which indicates that this issue needs further investigation; I will update you soon. Thank you for your understanding and patience.
CC: @pyu10055
Here is the error log output for reference:
2023-12-07 11:20:09.825253: E external/local_xla/xla/stream_executor/cuda/cuda_dnn.cc:9261] Unable to register cuDNN factory: Attempting to register factory for plugin cuDNN when one has already been registered
2023-12-07 11:20:09.825350: E external/local_xla/xla/stream_executor/cuda/cuda_fft.cc:607] Unable to register cuFFT factory: Attempting to register factory for plugin cuFFT when one has already been registered
2023-12-07 11:20:09.827697: E external/local_xla/xla/stream_executor/cuda/cuda_blas.cc:1515] Unable to register cuBLAS factory: Attempting to register factory for plugin cuBLAS when one has already been registered
2023-12-07 11:20:11.925390: W tensorflow/compiler/tf2tensorrt/utils/py_utils.cc:38] TF-TRT Warning: Could not find TensorRT
WARNING:tensorflow:From /usr/local/lib/python3.10/dist-packages/keras/src/layers/normalization/batch_normalization.py:883: _colocate_with (from tensorflow.python.framework.ops) is deprecated and will be removed in a future version.
Instructions for updating:
Colocations handled automatically by placer.
Traceback (most recent call last):
File "/usr/local/bin/tensorflowjs_converter", line 8, in <module>
sys.exit(pip_main())
File "/usr/local/lib/python3.10/dist-packages/tensorflowjs/converters/converter.py", line 958, in pip_main
main([' '.join(sys.argv[1:])])
File "/usr/local/lib/python3.10/dist-packages/tensorflowjs/converters/converter.py", line 962, in main
convert(argv[0].split(' '))
File "/usr/local/lib/python3.10/dist-packages/tensorflowjs/converters/converter.py", line 948, in convert
_dispatch_converter(input_format, output_format, args, quantization_dtype_map,
File "/usr/local/lib/python3.10/dist-packages/tensorflowjs/converters/converter.py", line 682, in _dispatch_converter
dispatch_tensorflowjs_to_keras_h5_conversion(args.input_path,
File "/usr/local/lib/python3.10/dist-packages/tensorflowjs/converters/converter.py", line 329, in dispatch_tensorflowjs_to_keras_h5_conversion
model = keras_tfjs_loader.load_keras_model(config_json_path)
File "/usr/local/lib/python3.10/dist-packages/tensorflowjs/converters/keras_tfjs_loader.py", line 299, in load_keras_model
return _deserialize_keras_model(config_json['modelTopology'],
File "/usr/local/lib/python3.10/dist-packages/tensorflowjs/converters/keras_tfjs_loader.py", line 95, in _deserialize_keras_model
weights_list.append(weights_dict[shorten_name])
KeyError: 'kernel'
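The traceback shows the lookup failing in `_deserialize_keras_model` on a *shortened* weight name. As a minimal illustrative sketch (this is not the actual tensorflowjs loader code; the exact shortening rule is an assumption), a topology name like `conv2d_1_1/kernel` could be reduced to `kernel` and then miss a weights dictionary whose keys were built differently:

```python
# Illustrative sketch only -- NOT the actual tensorflowjs loader code.
# Assumption: the loader strips the layer scope (everything up to the
# last '/') from a weight name before looking it up.

def shorten_name(name: str) -> str:
    """Strip the layer scope, e.g. 'conv2d_1_1/kernel' -> 'kernel'."""
    return name.split('/')[-1]

# Hypothetical weights manifest: values keyed by the full names from the
# weights files. If these keys are not shortened the same way as the
# topology names, the lookup fails.
weights_dict = {"conv2d_1_1/kernel": [0.1, 0.2]}

topology_name = "conv2d_1_1/kernel"
key = shorten_name(topology_name)  # -> 'kernel'
try:
    value = weights_dict[key]
except KeyError as exc:
    print(f"KeyError: {exc}")  # mirrors the KeyError: 'kernel' above
```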
Describe the current behavior
I'm trying to convert the TensorFlow.js layers model from this repo to Keras. However, when executing the converter I get a
KeyError: 'kernel'
The layer causing the issue seems to be
conv2d_1_1/kernel
Describe the expected behavior
The model gets converted correctly without crashing.
Standalone code to reproduce the issue
tensorflowjs_converter --input_format tfjs_layers_model --output_format keras tfjsmodel4/my-model.json my_keras/
Other info / logs