Problem
Running the app depends on access to the camera hardware. This ties up time and resources by requiring that functional tests run on the Engineering Model (EM) flatsat.
Solution
We should be able to run the app with a test flag that substitutes the camera-hardware image acquisition with a mock that simply copies .ims_rgb and .png files from a test image directory into the app's root directory. A single image per default label should suffice, together with a mechanism that renames the image files based on the expected timestamp.
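The copy-and-rename mock could look something like the sketch below. The names `TEST_IMAGE_DIR`, `APP_ROOT`, `mock_acquire`, and the `img_<timestamp>` filename pattern are illustrative assumptions, not the app's actual configuration:

```python
import shutil
import time
from pathlib import Path

# Hypothetical paths -- substitute the app's real test-image and root dirs.
TEST_IMAGE_DIR = Path("test_images")
APP_ROOT = Path(".")

def mock_acquire(label, timestamp=None):
    """Stand-in for camera acquisition: copy the sample .ims_rgb and .png
    files for `label` into the app root, renamed to the expected timestamp."""
    timestamp = timestamp or time.strftime("%Y%m%d_%H%M%S")
    copied = []
    for ext in (".ims_rgb", ".png"):
        src = TEST_IMAGE_DIR / f"{label}{ext}"
        dst = APP_ROOT / f"img_{timestamp}{ext}"  # rename to expected timestamp
        shutil.copy(src, dst)
        copied.append(dst)
    return copied
```

With one sample image per default label in `TEST_IMAGE_DIR`, the rest of the pipeline sees files that look freshly acquired.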
Additionally
Downsampling and the PNG-to-JPEG conversions performed by pngtopam, pamscale, and pnmtojpeg will also have to be mocked by copying sample _thumbnail.jpeg and _input.jpeg files into place.
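A minimal sketch of mocking the conversion pipeline, again with assumed names (`SAMPLE_DIR`, `mock_convert`, and the sample JPEG filenames are hypothetical):

```python
import shutil
from pathlib import Path

# Hypothetical location of pre-made sample JPEGs.
SAMPLE_DIR = Path("test_images")

def mock_convert(png_path):
    """Stand-in for the pngtopam / pamscale / pnmtojpeg chain: instead of
    converting `png_path`, copy pre-made sample JPEGs next to it."""
    stem = png_path.with_suffix("")
    thumb = Path(f"{stem}_thumbnail.jpeg")
    inp = Path(f"{stem}_input.jpeg")
    shutil.copy(SAMPLE_DIR / "sample_thumbnail.jpeg", thumb)
    shutil.copy(SAMPLE_DIR / "sample_input.jpeg", inp)
    return thumb, inp
```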
Todo
[x] Implement mock objects (e.g. MockHDCamera with test images) to run the app in a local dev environment
[x] Implement a test/debug mode support for TF Lite inference.
[x] Implement a test/debug mode support for K_Means image clustering.
[x] Create requirements.txt to run app in local dev Python VM.
[x] Log execution time when running external binaries.
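The last checklist item, timing external binaries, can be sketched as a small wrapper around subprocess (the function name `run_timed` and the logger name are assumptions, not the app's actual code):

```python
import logging
import shlex
import subprocess
import time

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("external")

def run_timed(cmd):
    """Run an external binary and log its wall-clock execution time."""
    start = time.monotonic()
    result = subprocess.run(cmd, capture_output=True, check=False)
    elapsed = time.monotonic() - start
    log.info("%s took %.3f s (exit %d)", shlex.join(cmd), elapsed, result.returncode)
    return result
```

Using `time.monotonic()` rather than `time.time()` avoids distortion from wall-clock adjustments while a binary is running.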