Closed hpssjellis closed 4 years ago
Thanks for submitting this! I shall pass this on to the team. For any other folk interested in this please do leave your thoughts below too.
Additional information:
Here is a link showing how I convert TFJS model.json and model.weights.bin files into an Arduino-style C header file.
This expects both tf-nightly and tensorflowjs to be installed.
tensorflowjs_converter --input_format=tfjs_layers_model --output_format=keras_saved_model ./model.json ./
tflite_convert --keras_model_file ./ --output_file ./model.tflite
xxd -i model.tflite model.h
The above commands are explained here
@hpssjellis As you have shown earlier, a TFJS layers model can be converted to a TF saved model and further converted to a TFLite model. But TFJS graph model conversion is not reversible: you need to go back to the original TF saved model to convert that to a TFLite model.
@pyu10055 Should I close this request, or is a direct JavaScript conversion from TFJS to TFLite a possibility? The command-line instructions I posted above are very easy to use, but they do require installation, which might be a negative for many JavaScript users.
pip install tf-nightly
pip install tensorflowjs
pip install netron
Then these commands (netron is used to view the .tflite file and other models, but presently cannot view the model.h file):
tensorflowjs_converter --input_format=tfjs_layers_model --output_format=keras_saved_model ./model.json ./
tflite_convert --keras_model_file ./ --output_file ./model.tflite
xxd -i model.tflite model.h
netron model.tflite
This issue has been automatically marked as stale because it has not had recent activity. It will be closed in 7 days if no further activity occurs. Thank you.
Closing as stale. Please @mention us if this needs more attention.
Please add the ability to convert TensorflowJS models into TensorflowLite (flatbuffers?) so that they can be used on micro-controllers such as the $100 dual-core Arduino Portenta H7.
I have a working sine-function model that loads onto both cores here and runs much faster than TensorflowLite does on most boards (I needed to add a delay of 3 microseconds per loop for the output even to be visible).
TensorflowJS is easier to work with than other forms of Tensorflow. Micro-controllers have a huge hobby and professional following. The ability to convert directly between TensorflowJS and TensorflowLite would dramatically simplify the process of using smaller specific machine learning models.
Note: The output should be a ready-made model.h file, not the regular .tflite file; see the explanation here. Look for: "I will save everything in just a .h file. That way the file is ready to upload immediately to a micro-controller." (It would probably be a good idea to also give the user the option to save as a .tflite file.)
Reasons to have TensorflowJS convert directly to TensorflowLite:
TensorflowJS on a webpage is very similar to a mobile app. Testing your models with TensorflowJS would be the fastest way to experiment and make changes before converting the model for a mobile device.
TensorflowJS to TensorflowLite for micro-controllers allows programming without learning Java or Python. Micro-controller C++ is reasonably simple.
The dual-core Portenta H7 has USB-C out to video, connections for webcams, and Bluetooth for a keyboard or mouse. This makes it similar to how TensorflowJS interacts with ML models.
As TensorflowLite supports only a subset of Tensorflow's operations, being able to quickly test whether your TensorflowJS model can be converted to TensorflowLite would be an asset.
Can anyone think of any other reasons?
P.S. I have done a fair bit of work making TensorflowJS simpler for high-school students by converting examples and demos into single-file vanilla Javascript here.