tensorflow / tfjs

A WebGL accelerated JavaScript library for training and deploying ML models.
https://js.tensorflow.org
Apache License 2.0

Converted Model Using TensorFlow.JS Converter Not Working #8358

Closed TheRealCasmat closed 1 month ago

TheRealCasmat commented 1 month ago

Hey There!

I was following the TFJS WebML YouTube course from Jason Mayes, specifically the TFJS Converter video. Here's my notebook: Google Colab Notebook

It is basically an exact replica of the notebook from the video, and aside from a few extra warnings here and there, the model files were generated: MobileNetV2.zip

On my site, I used TFJS with tf.loadLayersModel('URL OF MODEL.JSON'), but it threw an error saying an InputLayer should be passed either a batchInputShape or an inputShape. There is a batch_shape in the model.json, though. TFJS is up to date, and I have no clue what's going on.
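Roughly, my loading code looks like this (the URL is a placeholder, not my real one):

// Minimal sketch of how I'm loading the converted model (URL is a placeholder).
async function loadModel() {
  try {
    const model = await tf.loadLayersModel('https://example.com/MobileNetV2/model.json');
    console.log('Loaded layers model', model);
  } catch (err) {
    // This is where the "An InputLayer should be passed either a batchInputShape
    // or an inputShape" error shows up.
    console.error('Error loading model:', err);
  }
}
loadModel();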

What I'm actually trying to convert is MobileNetV3-Large, but the same thing happened there too. Using the same notebook as above, just replacing tf.keras.applications.MobileNetV2 with tf.keras.applications.MobileNetV3Large, I was left with this: MobileNetV3-Large.zip

Any help is appreciated! This is probably just me being stupid, as I'm learning TF/TFJS and ML in general, so sorry in advance!

Have a good one, -- @TheRealCasmat

shmishra99 commented 1 month ago

Hi @TheRealCasmat ,

I have reproduced the error you are experiencing.

The error Error loading model: Error: An InputLayer should be passed either a batchInputShape or an inputShape indicates that the TensorFlow.js converter cannot determine the expected input shape for your model. This is because model.save() does not save the InputLayer explicitly, only the model architecture and weights.

You can use the following code snippet to save your model:

import tensorflow as tf

# Build MobileNetV2 with an explicit input_shape so the input spec is baked into the graph.
model = tf.keras.applications.MobileNetV2(
    input_shape=(224, 224, 3), weights='imagenet', classifier_activation='softmax'
)

# Export as a TensorFlow SavedModel.
tf.saved_model.save(model, 'sample_data/tf_model')

This will generate a SavedModel directory. You can then convert it to a TensorFlow.js graph model using:

tensorflowjs_converter --input_format=tf_saved_model --output_format=tfjs_graph_model sample_data/tf_model/ sample_data/tfjs_model

You can then use this model like this:

async function loadModel() {
  const model = await tf.loadGraphModel('sample_data/tfjs_model/model.json')
  console.log("model...!!", model)
}
loadModel()

Output: [screenshot of the loaded model logged to the console]
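As a rough follow-up, running inference on the loaded graph model could look something like this (the dummy input and the 224x224x3 shape are just assumptions for illustration):

// Rough sketch of inference with the converted graph model.
// The 1x224x224x3 input matches the MobileNetV2 export above.
async function run() {
  const model = await tf.loadGraphModel('sample_data/tfjs_model/model.json');
  const dummyInput = tf.zeros([1, 224, 224, 3]); // stand-in for a preprocessed image
  const logits = model.predict(dummyInput);      // graph models expose predict()/execute()
  logits.print();
  dummyInput.dispose();
  logits.dispose();
}
run();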

The documentation provides more detailed information.

Let me know if this helps. Thank you!

TheRealCasmat commented 1 month ago

Hey @shmishra99!

Thanks for your reply, but I was looking for a way to get a layers model as output for transfer learning. How could I modify this to do that?
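For context, what I'm hoping to do with a layers model is roughly this (the layer name and sizes are just placeholders):

// Rough sketch of the transfer-learning setup I'm after (names/sizes are placeholders).
async function buildTransferModel() {
  const base = await tf.loadLayersModel('URL OF MODEL.JSON');

  // Cut the base model at an intermediate layer and freeze it.
  const bottleneck = base.getLayer('global_average_pooling2d'); // placeholder layer name
  const featureExtractor = tf.model({ inputs: base.inputs, outputs: bottleneck.output });
  featureExtractor.trainable = false;

  // New trainable head on top of the extracted features.
  const head = tf.sequential({
    layers: [tf.layers.dense({ inputShape: [1280], units: 3, activation: 'softmax' })],
  });

  return { featureExtractor, head };
}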

Sorry for the confusion! -- @TheRealCasmat

gaikwadrahul8 commented 1 month ago

Hi, @TheRealCasmat

I tried to replicate the behavior on my end and was able to reproduce it with your provided Colab notebook. Installing the latest TensorFlow.js pulls in Keras 3, and as far as I know tfjs_converter is only compatible with Keras 2 at the moment. So I downgraded TensorFlow to 2.15.0 (which installs Keras 2.15.0, compatible with tfjs_converter), converted the model to TensorFlow.js format, and it works as expected. Please refer to this gist-file, give it a try on your end, and it should work.
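Roughly, the idea is to pin TF to 2.15 so Keras 2 is used, save the Keras model, and run tensorflowjs_converter; a sketch (the exact versions and paths here are illustrative, please refer to the gist itself):

pip install tensorflow==2.15.0 tensorflowjs
# In Python (Keras 2.15): save the model in HDF5 format, e.g. model.save('mobilenet_v2.h5')
tensorflowjs_converter --input_format=keras --output_format=tfjs_layers_model mobilenet_v2.h5 sample_data/tfjs_layers_model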

If the issue still persists, please let us know with the error log so we can investigate this issue further from our end.

For your reference, I have added a screenshot below:

[screenshot of the successful conversion and working model]

Thank you for your cooperation and patience.

TheRealCasmat commented 1 month ago

Hey!

Sorry for the delayed response, but MobileNetV3Large cannot be converted with the gist-file you provided. All I changed was MobileNetV2 to MobileNetV3Large. When doing so, the console returns:

Uncaught (in promise) Error: Unknown layer: Rescaling. This may be due to one of the following reasons:
1. The layer is defined in Python, in which case it needs to be ported to TensorFlow.js or your JavaScript code.
2. The custom layer is defined in JavaScript, but is not registered properly with tf.serialization.registerClass().
    at rD (generic_utils.js:243:13)
    at sM (serialization.js:31:10)
    at u (container.js:1206:11)
    at t.fromConfig (container.js:1234:7)
    at rD (generic_utils.js:278:11)
    at sM (serialization.js:31:10)
    at models.js:295:7
    at c (runtime.js:63:40)
    at Generator._invoke (runtime.js:293:22)
    at Generator.next (runtime.js:118:21)
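For what it's worth, option 2 from that error would look roughly like the sketch below (the scale/offset handling is a placeholder, not taken from the real model config), but I'd rather get a proper conversion than hand-port layers:

// Rough sketch of registering a stand-in Rescaling layer so loadLayersModel can deserialize it.
class Rescaling extends tf.layers.Layer {
  constructor(config) {
    super(config);
    this.scale = config.scale;        // placeholder: read from the layer config if present
    this.offset = config.offset || 0; // placeholder default
  }
  call(inputs) {
    const x = Array.isArray(inputs) ? inputs[0] : inputs;
    return tf.tidy(() => x.mul(this.scale).add(this.offset));
  }
  computeOutputShape(inputShape) {
    return inputShape;
  }
  static get className() {
    return 'Rescaling';
  }
}
tf.serialization.registerClass(Rescaling);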

Thanks! -- @TheRealCasmat

shmishra99 commented 1 month ago

Hi @TheRealCasmat,

I successfully converted the MobileNetV3Large model using tfjs_converter. Please follow the instructions in this gist to replicate the process.

Let me know if it works for you.

Thank you!

TheRealCasmat commented 1 month ago

Thanks @shmishra99!! Been stuck for days but now able to move on with my project! Thanks @gaikwadrahul8 for your help as well!

Thanks, -- @TheRealCasmat
