Origami-Cloudless-AI / TinyMLaaS-2023-winter

Run Hello World of TensorFlow Lite for micro automatically in Docker
https://Origami-TinyML.github.io/tflm_hello_world
Apache License 2.0

WebApp shows camera image & prediction #56

Open doyu opened 1 year ago

doyu commented 1 year ago

The parent issue is #45 (User stories): https://miro.com/app/board/uXjVPwQdIjc=/

Acceptance test (RF)

  1. WebApp takes cam image.
  2. WebApp passes it to TinyML docker.
  3. TinyML docker predicts with passed image.
  4. TinyML docker returns the prediction.
  5. WebApp shows the original image & predicted label.
  6. WebApp calculates & shows accuracy percentage.
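
A minimal sketch of steps 1-6, assuming a Streamlit-based WebApp and an HTTP `/predict` endpoint exposed by the TinyML docker on port 8000 (the endpoint, port, and JSON response shape are assumptions, not the project's actual interface):

```python
# Hypothetical sketch of the acceptance-test flow; endpoint URL, port and
# response format are assumptions, not the real TinyML docker interface.
import requests
import streamlit as st

TINYML_URL = "http://localhost:8000/predict"  # assumed TinyML docker endpoint

if "hits" not in st.session_state:
    st.session_state.hits = 0
    st.session_state.total = 0

# 1. WebApp takes a camera image.
img = st.camera_input("Take a picture")

if img is not None:
    # 2. WebApp passes it to the TinyML docker.
    resp = requests.post(TINYML_URL, files={"image": img.getvalue()})
    # 3.-4. TinyML docker predicts and returns the prediction.
    label = resp.json()["label"]  # assumed response format

    # 5. WebApp shows the original image & the predicted label.
    st.image(img)
    st.write(f"Prediction: {label}")

    # 6. WebApp calculates & shows an accuracy percentage, here interpreted
    # as a running hit rate against a user-supplied expected label.
    expected = st.text_input("Expected label")
    if expected:
        st.session_state.total += 1
        if expected == label:
            st.session_state.hits += 1
        accuracy = 100 * st.session_state.hits / st.session_state.total
        st.write(f"Accuracy: {accuracy:.1f}%")
```

The accuracy in step 6 is only one possible interpretation (comparing against a label the tester types in); the RF acceptance test may define it differently.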

FexbYk23 commented 1 year ago

I can't make progress on this until #55 is complete.

doyu commented 1 year ago

Step 1 above might be related to https://github.com/Origami-TinyML/tflm_hello_world/issues/49#issuecomment-1436047019

ArttuLe commented 1 year ago

Was the "x86 simulation docker" meant for this task? I made the Dockerfile at some point but wasn't sure which task, if any, it was related to 😅 @doyu

doyu commented 1 year ago

IIUC, this is Sprint 2. In Sprint 2 we use x86_sim running in Docker as the device. That's all here.
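
For illustration only, a sketch of what "x86_sim in Docker as the device" could look like from the WebApp side; the image name `x86_sim`, the port, and the `/predict` endpoint are assumptions:

```python
# Hypothetical: start the x86 simulation container and query it like a device.
import subprocess
import time
import requests

# Run the simulator container, exposing an assumed prediction port.
subprocess.run(
    ["docker", "run", "-d", "--rm", "--name", "x86_sim", "-p", "8000:8000", "x86_sim"],
    check=True,
)
time.sleep(2)  # crude wait for the container to come up

# The WebApp then talks to it exactly as it would to any prediction service.
with open("test_image.jpg", "rb") as f:
    resp = requests.post("http://localhost:8000/predict", files={"image": f})
print(resp.json())
```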

ArttuLe commented 1 year ago

So is this task still relevant if/when we have the Arduino running with the model in this sprint?

doyu commented 1 year ago

Yes. Every operation should be dockerized except the ones running inside the device.