tensorflow / tfjs

A WebGL accelerated JavaScript library for training and deploying ML models.
https://js.tensorflow.org
Apache License 2.0

[tfjs-tflite] How to use a particular subgraph of a tflite model? #6919

Open josephrocca opened 1 year ago

josephrocca commented 1 year ago

This tflite file has two subgraphs: encode, and decode. If I load the model with tflite.loadTFLiteModel and then call model.predict, it uses the decode subgraph by default (this subgraph is also shown by default when loading the model in netron.app).

How can I use the encode subgraph instead? I couldn't find any docs on this, and after sleuthing through the tfjs-tflite code I wasn't able to find any relevant config options.
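For reference, here's a hedged sketch of how one can at least inspect which tensor names the loaded model exposes. This assumes the tfjs-tflite bundle is available as a global `tflite`, and relies on the `inputs`/`outputs` getters from the `InferenceModel` interface that `TFLiteModel` implements:

```javascript
// Hedged sketch: list the tensor names of the single signature that
// loadTFLiteModel exposes (only the default subgraph's names show up).
// Assumes the tfjs-tflite bundle is loaded as the global `tflite`.
async function showSignature(url) {
  const model = await tflite.loadTFLiteModel(url);
  console.log('inputs: ', model.inputs.map(t => t.name));
  console.log('outputs:', model.outputs.map(t => t.name));
}
```

For this model, only the decode subgraph's names are listed, which is consistent with the error below.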

I initially thought that I could just refer to the inputs/outputs of the subgraph, but that only works for the input/output of the decode subgraph:

// works:
let output = model.predict({"decode_encoding_indices:0":bits});
// doesn't work (throws error, shown below):
let output = model.predict({"encode_input_frames:0":embedding, "encode_num_quantizers:0":numQuantizers}); 

The error thrown by the latter is:

Uncaught Error: The model input names don't match the model input names. Names in input but missing in model: [encode_input_frames:0,encode_num_quantizers:0]. Names in model but missing in inputs: [decode_encoding_indices:0].
    at TFLiteModel.checkMapInputs (tf-tflite.js:10318:19)
    at TFLiteModel.predict (tf-tflite.js:10159:22)
    at quantize ((index):53:36)
    at testQuantization (<anonymous>:6:24)
    at <anonymous>:1:7

I've double-checked that I've got the input names correct.
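To make the failure mode concrete, here is a minimal runnable sketch (an assumption on my part, simplified from what `TFLiteModel.checkMapInputs` in tf-tflite.js appears to do) of why the second call throws: the loaded model only registers the default (decode) subgraph's input names, so the encoder's names are always rejected, regardless of spelling.

```javascript
// Simplified stand-in for TFLiteModel.checkMapInputs (an assumption, not
// the actual library code): compare the provided input names against the
// names registered for the model's single visible signature.
function checkMapInputs(inputs, modelInputNames) {
  const given = Object.keys(inputs);
  const missingInModel = given.filter(n => !modelInputNames.includes(n));
  const missingInInputs = modelInputNames.filter(n => !given.includes(n));
  if (missingInModel.length > 0 || missingInInputs.length > 0) {
    throw new Error(
        `The model input names don't match the model input names. ` +
        `Names in input but missing in model: [${missingInModel}]. ` +
        `Names in model but missing in inputs: [${missingInInputs}].`);
  }
}

// After loadTFLiteModel, only the decode subgraph's signature is visible:
const modelInputNames = ['decode_encoding_indices:0'];

// Matches the registered signature, so no error:
checkMapInputs({'decode_encoding_indices:0': 0}, modelInputNames);

// Encoder names are unknown to the model, so this throws:
try {
  checkMapInputs(
      {'encode_input_frames:0': 0, 'encode_num_quantizers:0': 0},
      modelInputNames);
} catch (e) {
  console.log(e.message);
}
```

So no amount of renaming on the caller's side can reach the encode subgraph; the signature itself would need to be selectable at load or predict time.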

gaikwadrahul8 commented 1 year ago

Hi, @josephrocca

Thank you for opening this issue. Since this issue has been open for a long time, the code/debug information for this issue may no longer be relevant to the current state of the code base.

The TFJS team is constantly improving the framework by fixing bugs and adding new features. We suggest you try the latest TFJS version with the latest compatible hardware configuration, which could potentially resolve the issue. If you are still facing the issue, please create a new GitHub issue with your latest findings and all the debugging information that could help us investigate.

Please follow the release notes to stay up to date with the latest developments happening in the TensorFlow.js space.

Thank you for your support and cooperation.

josephrocca commented 1 year ago

@gaikwadrahul8 This issue is still relevant.

gaikwadrahul8 commented 11 months ago

Hi, @josephrocca

I sincerely apologize for the inconvenience. Could you please help me with steps to replicate the issue on my end? Also, the "This tflite file" link in the original post is broken.

I would greatly appreciate it if you could provide me with a code-snippet/CodePen example or GitHub repository link with steps to reproduce the issue on my end. Thank you for your time and patience!

josephrocca commented 11 months ago

Apologies - here's the new link: https://huggingface.co/rocca/lyra-v2-soundstream/blob/main/tflite/1.3.0/quantizer.tflite. I've updated the original post with it.

Here's some example code where I've broken the quantizer.tflite file into two separate files: https://github.com/josephrocca/lyra-v2-soundstream-web/blob/main/tflite-simple.html

I don't know of equivalent code that would use the single quantizer.tflite file above and access the two subgraphs within it. I don't think this feature currently exists in tfjs-tflite, so this issue is effectively a feature request.
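For anyone landing here, the two-file workaround can be sketched roughly like this. The file names are hypothetical, and this assumes the tfjs-tflite bundle is loaded as the global `tflite`:

```javascript
// Hedged sketch of the split-model workaround: one tflite file per subgraph,
// loaded as two separate models. File names here are hypothetical.
// Assumes the tfjs-tflite bundle is loaded as the global `tflite`.
async function encodeDecode(frames, numQuantizers) {
  const encoder = await tflite.loadTFLiteModel('quantizer_encoder.tflite');
  const decoder = await tflite.loadTFLiteModel('quantizer_decoder.tflite');
  const bits = encoder.predict({
    'encode_input_frames:0': frames,
    'encode_num_quantizers:0': numQuantizers,
  });
  return decoder.predict({'decode_encoding_indices:0': bits});
}
```

The downside, as noted below, is that each file gets its own interpreter/XNNPACK initialization instead of sharing one.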

surajpandey353 commented 1 month ago

I would like to follow up on this issue.

I recently discovered that keeping two disjoint subgraphs in a single tflite file is good in terms of memory consumption and speed, probably because XNNPACK is initialized only once. I am really interested in whether we can invoke a particular subgraph of a tflite model and get that subgraph's outputs.