tflm_hello_world

Run the Hello World example of TensorFlow Lite for Microcontrollers automatically in Docker.

Documentation: https://Origami-TinyML.github.io/tflm_hello_world
License: Apache License 2.0

Badges: GHA workflow status, Codecov coverage

Install

pip install tflm_hello_world
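
After installation, the import below serves as a quick smoke test; it only assumes that a module named tflm_hello_world is installed, not any particular API:

# Smoke test: confirm the package is importable.
# No further API is assumed here; consult the package itself for usage.
import tflm_hello_world

print("tflm_hello_world imported successfully")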

Instructions

Service

Run “make app” in the repository root to build and start all the containers required by the service:

make app
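
Once make app finishes, a small script such as the sketch below can poll the web UI to confirm the service is up. The URL is an assumption: 8501 is Streamlit's default port, and the actual container port mapping may differ.

# Hypothetical readiness check for the service started by "make app".
# Assumes the Streamlit UI is reachable on localhost:8501 (Streamlit's default);
# adjust the URL if the containers map a different port.
import time
import urllib.request

URL = "http://localhost:8501"

for _ in range(30):
    try:
        with urllib.request.urlopen(URL, timeout=2) as resp:
            print("Service is up, HTTP", resp.status)
            break
    except OSError:
        time.sleep(2)
else:
    print("Service did not respond; check the make output and running containers")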

Bridging server

If the end device does not have Wi-Fi support, a bridging device is needed.

Expose port 5000 so that the bridging device can be reached by outside sources:

ngrok http 5000

Copy the address that ngrok exposes (its public forwarding URL) into the Streamlit service.

Start up the bridging server:

python main.py
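
The repository's actual main.py is not shown here. As an illustration only, a minimal bridging server could look like the following sketch: it listens on port 5000 (the port exposed through ngrok above) and accepts a model binary posted by the cloud service, storing it for later transfer to the end device. The Flask dependency, the /upload route, and the file name are all assumptions, not the repository's real interface.

# Hypothetical sketch of a bridging server; the repository's real main.py may differ.
# It receives a compiled model over HTTP from the cloud service and stores it
# locally so it can later be flashed to the end device (e.g. over USB/serial).
from flask import Flask, request

app = Flask(__name__)
MODEL_PATH = "received_model.tflite"  # assumed file name

@app.route("/upload", methods=["POST"])  # assumed route
def upload_model():
    data = request.get_data()  # raw model bytes assumed in the request body
    with open(MODEL_PATH, "wb") as f:
        f.write(data)
    return {"status": "ok", "bytes": len(data)}

if __name__ == "__main__":
    # Port 5000 matches the port exposed through ngrok in the step above.
    app.run(host="0.0.0.0", port=5000)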

Working hours tracking

Click Here