tensorflow / tfjs

A WebGL accelerated JavaScript library for training and deploying ML models.
https://js.tensorflow.org
Apache License 2.0

Error: Unknown layer: Functional. This may be due to one of the following reasons #3786

Closed KailiangGu closed 4 years ago

KailiangGu commented 4 years ago

I am new to TensorFlow and JS. I was trying to convert a MobileNet V2 model from Keras to TensorFlow.js. When I try to load the model in JS, I get the following error:

```
errors.ts:48 Uncaught (in promise) Error: Unknown layer: Functional. This may be due to one of the following reasons:

  1. The layer is defined in Python, in which case it needs to be ported to TensorFlow.js or your JavaScript code.
  2. The custom layer is defined in JavaScript, but is not registered properly with tf.serialization.registerClass().
    at new e (errors.ts:48)
    at Rp (generic_utils.ts:242)
    at cd (serialization.ts:31)
    at e.fromConfig (models.ts:942)
    at Rp (generic_utils.ts:277)
    at cd (serialization.ts:31)
    at models.ts:300
    at common.ts:14
    at Object.next (common.ts:14)
    at a (common.ts:14)
```

I construct my model from a MobileNet V2 pre-trained model and add two layers on top of it, and I convert the model using tensorflowjs_converter from a saved HDF5 (.h5) file of the model. Here is my model summary. The mobilenetv2_1.00_160 layer has the type Functional, which I believe is the cause of the error.

```
Model: "sequential"
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
mobilenetv2_1.00_160 (Functi (None, 5, 5, 1280)        2257984
_________________________________________________________________
global_average_pooling2d (Gl (None, 1280)              0
_________________________________________________________________
dense (Dense)                (None, 3755)              4810155
=================================================================
Total params: 7,068,139
Trainable params: 7,034,027
Non-trainable params: 34,112
_________________________________________________________________
```
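As a side note, the parameter counts in the summary can be checked by hand with the standard fully connected formula (inputs × units + units); a quick sketch:

```python
# Sanity-check the parameter counts reported in the summary above.
# Dense params = inputs * units + biases (standard fully connected formula).
mobilenet_params = 2_257_984          # reported for mobilenetv2_1.00_160
pooling_params = 0                    # global average pooling has no weights
dense_params = 1280 * 3755 + 3755    # (None, 1280) -> (None, 3755)

total = mobilenet_params + pooling_params + dense_params
print(dense_params)  # 4810155
print(total)         # 7068139
```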

lospericos commented 4 years ago

I had an issue like this, albeit I'm using MobileNet, not V2. I don't have the specific link, but a Stack Overflow thread somewhere said to open public/model.json, Ctrl+F for "Functional", and replace the word with "Model". Worked for me!
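For anyone who would rather script that edit than do it by hand, here is a minimal sketch (the default path and the nested-layer handling are assumptions; this is a structural rename only and touches no weights):

```python
import json

def patch_functional_to_model(path="public/model.json"):
    """Rename the Keras "Functional" class to "Model" in a tfjs
    layers-model model.json so older loaders can deserialize it."""
    with open(path) as f:
        manifest = json.load(f)

    model_config = manifest["modelTopology"]["model_config"]
    if model_config.get("class_name") == "Functional":
        model_config["class_name"] = "Model"

    # A nested functional sub-model (e.g. a MobileNetV2 base inside a
    # Sequential) carries the same class name, so patch layers as well.
    for layer in model_config.get("config", {}).get("layers", []):
        if layer.get("class_name") == "Functional":
            layer["class_name"] = "Model"

    with open(path, "w") as f:
        json.dump(manifest, f)
```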

KailiangGu commented 4 years ago

Replacing "Functional" with "Model" indeed solves this error. However, it also gives me the following new error:

Uncaught (in promise) Error: computeMask has not been implemented for Merge yet

rthadur commented 4 years ago

@KailiangGu @lospericos I was able to reproduce the same for my model conversion. @pyu10055 can you please assist with this bug?

pyu10055 commented 4 years ago

@KailiangGu have you tried converting the HDF5 model to a TFJS graph model?

starrabb1t commented 4 years ago

The same issue with Keras MobileNet v1. Replacing the word with "Model" worked for me, thanks!

rthadur commented 4 years ago

@pyu10055 I tried converting to the TFJS graph model format and had no issues. I used the command:

`!tensorflowjs_converter --input_format keras --output_format=tfjs_graph_model my_sample_model.h5 model/`

@KailiangGu @starrabb1t can you please try using the TFJS graph model?

google-ml-butler[bot] commented 4 years ago

This issue has been automatically marked as stale because it has not had recent activity. It will be closed in 7 days if no further activity occurs. Thank you.


carlosdelamora commented 3 years ago

> I had an issue like this albeit im using MobileNet not v2. I don't have the specific link, but a stack overflow thread somewhere said to go into the public/model.json folder and ctrl+f "Functional" and replace the word with "Model". worked for me!

Where is this public/model.json folder located?

grantbrewster commented 3 years ago

Hey everyone, I am running into a slight variation of this problem.

This error is now thrown on a layer called KerasLayer, and when I try changing that to Model, it says it cannot iterate an undefined object :/

What I have done:

  1. Trained a MobileNetV2 keras model from the tf.keras.applications.MobileNetV2
  2. Saved that using tfjs.converters.save_keras_model(model, path)
  3. Then loaded those paths in tfjs using tf.loadLayersModel(path_to_model)

Now every time I try to load that model I get the error:

```
runtime.js:728 Uncaught (in promise) Error: Unknown layer: KerasLayer. This may be due to one of the following reasons:

  1. The layer is defined in Python, in which case it needs to be ported to TensorFlow.js or your JavaScript code.
  2. The custom layer is defined in JavaScript, but is not registered properly with tf.serialization.registerClass().
    at jN (VM761 tf.min.js:17)
    at GI (VM761 tf.min.js:17)
    at e.fromConfig (VM761 tf.min.js:17)
    at jN (VM761 tf.min.js:17)
    at GI (VM761 tf.min.js:17)
    at VM761 tf.min.js:17
    at u (VM761 tf.min.js:17)
    at Generator._invoke (VM761 tf.min.js:17)
    at Generator.forEach.t. [as next] (VM761 tf.min.js:17)
    at Wm (VM761 tf.min.js:17)
```

When I try to replace KerasLayer with Model in model.json I get:

```
runtime.js:728 Uncaught (in promise) TypeError: undefined is not iterable (cannot read property Symbol(Symbol.iterator))
    at Zm (VM791 tf.min.js:17)
    at e.fromConfig (VM791 tf.min.js:17)
    at jN (VM791 tf.min.js:17)
    at GI (VM791 tf.min.js:17)
    at e.fromConfig (VM791 tf.min.js:17)
    at jN (VM791 tf.min.js:17)
    at GI (VM791 tf.min.js:17)
    at VM791 tf.min.js:17
    at u (VM791 tf.min.js:17)
    at Generator._invoke (VM791 tf.min.js:17)
```

This is what the top part of my model.json looks like:

```json
{"format": "layers-model", "generatedBy": "keras v2.4.0", "convertedBy": "TensorFlow.js Converter v3.6.0",
 "modelTopology": {"keras_version": "2.4.0", "backend": "tensorflow",
  "model_config": {"class_name": "Sequential", "config": {"name": "sequential_3", "layers": [
   {"class_name": "InputLayer", "config": {"batch_input_shape": [null, 224, 224, 3], "dtype": "float32", "sparse": false, "ragged": false, "name": "keras_layer_2_input"}},
   {"class_name": "KerasLayer", "config": {"name": "keras_layer_2", "trainable": false, "batch_input_shape": [null, 224, 224, 3], "dtype": "float32", "handle": "https://tfhub.dev/google/tf2-preview/mobilenet_v2/feature_vector/4"}},
   {"class_name": "ReLU", "config": {"name": "re_lu", "trainable": true, "dtype": "float32", "max_value": null, "negative_slope": 0.0, "threshold": 0.0}},
   {"class_name": "Dropout", "config": {"name": "dropout_2", "trainable": true, "dtype": "float32", "rate": 0.5, "noise_shape": null, "seed": null}},
   {"class_name": "Dense", "config": {"name": "dense_3", "trainable": true, "dtype": "float32", "units": 4, "activation": "linear", "use_bias": true, "kernel_initializer": {"class_name": "GlorotUniform", "config": {"seed": null}}, "bias_initializer": {"class_name": "Zeros", "config": {}}, "kernel_regularizer": null, "bias_regularizer": null, "activity_regularizer": null, "kernel_constraint": null, "bias_constraint": null}}]}},
  "training_config": {"loss": {"class_name": "SparseCategoricalCrossentropy", "config": {"reduction": "auto", "name": "sparse_categorical_crossentropy", "from_logits": true}}, "metrics": [[{"class_name": "MeanMetricWrapper", "config": {"name": "acc", "dtype": "float32", "fn": "sparse_categorical_accuracy"}}]],
```
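The class_name entries above hint at the problem: KerasLayer (from tensorflow_hub) has no JS-side deserializer, and its config holds only a hub handle, not the nested "layers" list Model.fromConfig iterates, which would explain the "undefined is not iterable" error after renaming it. A quick stdlib sketch for auditing a model.json's layer classes (the function name is my own):

```python
import json

def audit_layer_classes(model_json_path):
    """Return the class_name of every layer in a tfjs layers-model
    model.json; spotting layers tfjs cannot deserialize (e.g.
    "KerasLayer") is then a matter of eyeballing the output."""
    with open(model_json_path) as f:
        topology = json.load(f)["modelTopology"]
    layers = topology["model_config"]["config"]["layers"]
    return [layer["class_name"] for layer in layers]
```

Run against the JSON above, this would return `["InputLayer", "KerasLayer", "ReLU", "Dropout", "Dense"]`.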

I've spent hours debugging this. Now, after rewriting another classification model from scratch, I'm struggling with yet another bug and can't get a prediction without this error: `Error: Error when checking model: the Array of Tensors that you are passing to your model is not the size the model expected. Expected to see 1 Tensor(s), but instead got 0 Tensors(s).`

It seems there is some bug in the tfjs keras converter?

raffizulvian commented 3 years ago

@grantbrewster have you tried saving it in the SavedModel format and then converting that to a graph model? I faced the same issue before and found an alternate solution. Instead of saving your model in the Keras (.h5) format, save it in the TensorFlow SavedModel format:

model.save('./path/to/save/model')

After that, you can convert the model to TensorFlow.js format using the command line.

$ tensorflowjs_converter --input_format=tf_saved_model \
                         --output_node_names='MobilenetV3/Predictions/Reshape_1' \
                         --saved_model_tags=serve \
                         ./input/savedmodel/path \
                         ./output/destination/path

Then load your model using loadGraphModel in your JavaScript:

const model = await tf.loadGraphModel(url);

This approach works fine for me. But keep in mind that this method doesn't let you fit the model in JavaScript; it only works for inference.

Further reading: TensorFlow tutorial

vcucu commented 3 years ago

If the problem persists after the fix of replacing "Functional" with "Model", you probably have tensorflowjs 3.5.0 or higher (the most recent is 3.6.0 at the moment). Downgrading to 3.4.0 resolves the issue.

TimWue commented 2 years ago

Got the same error today. The problem was that I imported an old version of tensorflowjs via CDN (the tensorflowjs documentation was not updated). See here for which versions can be used within the script tag of your CDN import. For me, 3.19.0 worked.