tensorflow / tfjs

A WebGL accelerated JavaScript library for training and deploying ML models.
https://js.tensorflow.org
Apache License 2.0

Make tflite model run directly with tfjs #991

Closed rajeev-tbrew closed 5 years ago

rajeev-tbrew commented 5 years ago

To get help from the community, check out our Google group. This is not a bug but a feature request: since the TensorFlow library provides TFLite models that run on the Android and iOS platforms, could we build a tfjs wrapper that allows tfjs to load a TFLite model directly in the browser? This would allow the same model to be used across multiple platforms.

nsthorat commented 5 years ago

I don't think this is something we'll ever support -- you can convert a SavedModel to tfjs without TF Lite.
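
For reference, converting a SavedModel to the TF.js format is typically done with the `tensorflowjs_converter` CLI from the `tensorflowjs` pip package; a sketch, where both paths are placeholders:

```shell
# Install the converter (assumes a working Python environment)
pip install tensorflowjs

# Convert a TensorFlow SavedModel to the TF.js graph-model format.
# The input and output paths below are hypothetical placeholders.
tensorflowjs_converter \
    --input_format=tf_saved_model \
    /path/to/saved_model \
    /path/to/web_model
```

The output directory then contains a `model.json` plus binary weight shards that `tf.loadGraphModel` can load in the browser.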

rthadur commented 5 years ago

Closing this, as this feature request will not be supported.

mukeshmithrakumar commented 5 years ago

Question for @mbostock and @rthadur: this is my first time working with JavaScript, so let me know if I'm way off. JS is a client-side programming language, right? And TF Lite runs on edge devices, so why not have a way to load TF Lite models directly via tfjs? It seems pretty straightforward.

nsthorat commented 5 years ago

Unfortunately it’s not as simple as you’d think. Is there a use case you have in mind that we can’t solve with TensorFlow.js itself?

mukeshmithrakumar commented 5 years ago

Oh no, @nsthorat, it's more of an architecture and performance decision. We have a voice assistant in TensorFlow, and there are two ways to move it to the edge: convert it to TF Lite and write the app in Java for Android, or convert it to tf.js and write it in JS. I've read that TF Lite has much better performance for ML, but writing in JS would be more convenient, not to mention it would spare me having to learn Java. So I was wondering if I could convert from TF to TF Lite and then load the model via JS. If you have any ideas, that would be really helpful.

nsthorat commented 5 years ago

Got it.

In the browser, compiling TF Lite would produce too large a binary to run directly, and it would be much slower than our kernels.

In Node.js, we're working on natively executing SavedModels, and we are also working on headless WebGL bindings.

These two should cover everything you'd need, unless I'm missing something!
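
For anyone following along, natively executing SavedModels in Node.js later shipped as `tf.node.loadSavedModel` in the `@tensorflow/tfjs-node` package; a minimal sketch, where the model path and input shape are placeholders:

```javascript
// Sketch: run a SavedModel natively under Node.js with tfjs-node,
// with no conversion to TF Lite or the TF.js web format required.
// Assumes `npm install @tensorflow/tfjs-node`; the path is hypothetical.
const tf = require('@tensorflow/tfjs-node');

async function main() {
  // Loads the SavedModel directly from disk
  const model = await tf.node.loadSavedModel('./saved_model');

  // Dummy input; the shape must match the model's serving signature
  const input = tf.zeros([1, 224, 224, 3]);
  const output = model.predict(input);
  output.print();
}

main();
```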

mukeshmithrakumar commented 5 years ago

Hey @nsthorat, thanks a lot! I was going through tfjs for Node.js and it looks like that would work great. Appreciate your help 🙂

nsthorat commented 5 years ago

No problem :)

opiepj commented 5 years ago

@nsthorat I have a use case that doesn't require a wrapper. Let me know if this is possible.

Question: Is it possible to convert a .tflite file to a .pb file then use tfjs-converter to convert to a js model?

Context: I'm using Firebase's AutoML library. It's pretty sweet: it creates generic .tflite files (TensorFlow Lite models) for me that I can plug into my native mobile app (I don't have access to a .pb output).

Reading about tfjs-converter, it seems I could convert a TF SavedModel into the JS model format offline. However, it doesn't seem possible to convert .tflite to JSON.

nsthorat commented 5 years ago

Hi @opiepj, we're actively working on getting TensorFlow.js models directly supported in AutoML, stay tuned!

no-1ne commented 5 years ago

https://github.com/intel/webml-polyfill is one way of running TFLite on the web.

nsthorat commented 5 years ago

@startupgurukul unfortunately that requires a custom chromium build.

no-1ne commented 5 years ago

Hi Nikhil, my understanding is that the polyfill is enough to run in any browser; a custom Chromium build is only needed for native performance.

Here are some examples they provide https://intel.github.io/webml-polyfill/examples/

bernardoespinosa commented 4 years ago

@rthadur I've checked the examples; it looks like I'd have to write a lot of JavaScript code like https://github.com/intel/webml-polyfill/blob/master/examples/face_recognition/utils.js to make Smart Reply work?

NathanBWaters commented 4 years ago

@nsthorat what should we do if we want the benefits of a fast tflite model but we want to run it on the web?

It doesn't seem possible to convert a .tflite model back into a TensorFlow model so that we can convert it into a TF.js model.

nsthorat commented 4 years ago

@NathanBWaters where did you get the .tflite model? .tflite models come from SavedModels which can be converted to tfjs.

NathanBWaters commented 4 years ago

@nsthorat thanks for responding so quickly. We are seeing faster results using a tflite blazeface model (see here) than using the TF model converted into TFjs.

So, though we can convert the original SavedModel into tfjs, we want to use the faster, more optimized .tflite version of the model.

nsthorat commented 4 years ago

@NathanBWaters We actually already have blazeface converted in tfjs: https://github.com/tensorflow/tfjs-models/tree/master/blazeface

Hope this helps!

rrjanbiah commented 4 years ago

@NathanBWaters We actually already have blazeface converted in tfjs: https://github.com/tensorflow/tfjs-models/tree/master/blazeface

@nsthorat Will it be possible for you to share how you did the conversion?

nsthorat commented 4 years ago

@rrjanbiah We converted it from the internal SavedModel (not sure why the team didn’t want to release it).

rrjanbiah commented 4 years ago

@nsthorat Thank you. I see a couple of good TFLite models, so knowing the conversion process would be helpful.

NathanBWaters commented 4 years ago

@nsthorat thanks for the link, we are currently using the tfjs blazeface model. However, we're seeing about 15fps for the tfjs model and 19fps for the tflite version. This makes us think that the tflite version of the model is faster.

So the question is still: is it possible to use the highly optimized .tflite models through tfjs? Or is there any way to use .tflite on the web?

We're asking because we assume that .tflite is more optimized than tfjs, and our project requires the models to be as fast as possible. If our assumption is not correct, please let us know!

Also note that Blazeface is not the only .tflite model we want to use on the web.

nsthorat commented 4 years ago

How are you comparing the two -- e.g. one in a browser, one natively on the same computer? That unfortunately isn't a fair comparison because we can't run native code from the web. Are you using the WebGL backend or the WASM backend?

We've done deep web-specific optimizations on our version so it should be as fast as it's going to be. Try using the WASM backend as well.
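
For context, switching TF.js to the WASM backend is a small change; a sketch, assuming the `@tensorflow/tfjs-backend-wasm` package is installed alongside `@tensorflow/tfjs`:

```javascript
// Sketch: select the WASM backend before running any inference.
// Assumes `npm install @tensorflow/tfjs @tensorflow/tfjs-backend-wasm`.
import * as tf from '@tensorflow/tfjs';
import '@tensorflow/tfjs-backend-wasm'; // registers the 'wasm' backend

async function setup() {
  await tf.setBackend('wasm'); // switch from the default (webgl) backend
  await tf.ready();            // wait for backend initialization
  console.log('Active backend:', tf.getBackend());
}
```

Benchmarking the same model under `tf.setBackend('webgl')` and `tf.setBackend('wasm')` is the easiest way to see which backend wins on a given device.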

Do you have .tflite models that you created? Where are the other .tflite models from?

NathanBWaters commented 4 years ago

@nsthorat here's how we're comparing the two: We're running both on our phone in the browser. One is using tfjs and is getting 15fps. When you remove the tfjs model, it's 60fps. We are using the WASM backend.

The other is on https://viz.mediapipe.dev/demo/face_detection. It looks like they compiled the tflite interpreter to wasm in order to run the .tflite file on the browser. It's getting 19 fps.

This is certainly an imperfect benchmark for the following reasons: 1) We're not sure how similar the two models are. Just because they're both named blazeface doesn't mean they're exactly the same. 2) We're not sure if the tfjs model is quantized https://tfhub.dev/tensorflow/blazeface/1

Please share your own benchmarks when you can because I've struggled to find great, in-depth performance comparisons between TF, TFJS, and tflite.

Essentially we're deciding between two options when deploying a model to the web: 1) Use tfjs 2) Compile the tflite interpreter to wasm and run .tflite models on the web.
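
As a side note on methodology, fps comparisons like the ones above are easiest to keep fair when both demos are timed the same way; a tiny helper (a sketch, not part of any library) could compute fps from per-frame durations:

```javascript
// Compute frames per second from an array of per-frame durations
// in milliseconds, e.g. collected around each model.predict() call.
function fps(frameTimesMs) {
  if (frameTimesMs.length === 0) return 0;
  const totalMs = frameTimesMs.reduce((a, b) => a + b, 0);
  return (1000 * frameTimesMs.length) / totalMs;
}

// Frames that each took ~66.7 ms correspond to roughly 15 fps
console.log(fps([66.7, 66.7, 66.7]).toFixed(1));
```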

Also, @nsthorat thanks in general for responding! It's greatly appreciated.

Shaiken commented 4 years ago

@NathanBWaters So... is https://github.com/intel/webml-polyfill the final solution? I've also been looking for a solution recently, and I want to apply it to Cordova as well.

alexcannan commented 4 years ago

One is using tfjs and is getting 15fps. When you remove the tfjs model, it's 60fps.

@NathanBWaters could you elaborate on this? What model are you using when you're getting 60 fps? Is this just the fps of your graphics engine w/o any model?

I'm facing a similar problem where I'm not sure whether to use a WASM TFLite interpreter or run it directly with TFJS. We're working with pretty tight performance and size constraints too.

NathanBWaters commented 4 years ago

We're doing this work for the blazeface and facemesh model. The 60fps was just in reference to how our application runs at max frames without the model. Fortunately they just dropped the facemesh model in tflite this Monday.

leoffx commented 4 years ago

I have a model with custom layers trained in Python that can't be converted to JS, and I wish to use it on Android devices through Ionic. Is https://github.com/intel/webml-polyfill still the best way to do it?

Rcuz8 commented 4 years ago

I'm not sure why this issue is closed. For Firebase users, the only natively supported export format seems to be .tflite (via the ML package), but this is an unusable format for a web-facing frontend. I see that the common recommendation so far is https://github.com/intel/webml-polyfill, but firstly, I couldn't find a way in that documentation to construct a model from a saved .tflite file, and secondly, using tfjs would obviously be preferred since it's already familiar.

wangtz commented 3 years ago

Running TFLite models in TFJS is now supported. Feel free to check out this video: https://www.youtube.com/watch?v=5q8BzYN4rqA
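
That support ships as the `@tensorflow/tfjs-tflite` package, which wraps the TFLite interpreter compiled to WASM; a minimal browser-side sketch, where the model URL and input shape are placeholders:

```javascript
// Sketch: run a .tflite model directly in the browser via tfjs-tflite.
// Assumes the tfjs and tfjs-tflite script bundles are loaded on the page,
// exposing the global `tf` and `tflite` objects; the URL is hypothetical.
async function runTfliteModel() {
  const model = await tflite.loadTFLiteModel('https://example.com/model.tflite');

  // Dummy input; the shape must match what the .tflite model expects
  const input = tf.zeros([1, 224, 224, 3]);
  const output = model.predict(input);
  output.print();
}
```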

NathanBWaters commented 3 years ago

@wangtz Incredible, thanks for the update!