JasonDCox / ML-Mentorship-GovSchool


Convert Desktop Tensorflow Models to lighter versions and run them on the Jetson #30

Closed brandonC1234 closed 2 years ago

brandonC1234 commented 2 years ago

Description

This involves taking the TensorFlow models from the desktop, converting them to TensorRT, TFLite, or TensorFlow.js, and transferring them to the Jetson by USB. I will then have to create a program on the Jetson to run them with input from the webcam.
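The desktop-side conversion step could look something like this. A minimal sketch assuming TensorFlow 2.x; the paths and the `quantize` option are placeholders, not anything from this repo:

```python
def convert_to_tflite(saved_model_dir, out_path, quantize=False):
    """Convert a TensorFlow SavedModel to a .tflite file.

    Sketch only: `saved_model_dir` and `out_path` are hypothetical paths.
    """
    import tensorflow as tf  # imported lazily so the sketch parses without TF

    converter = tf.lite.TFLiteConverter.from_saved_model(saved_model_dir)
    if quantize:
        # Post-training dynamic-range quantization shrinks the model further.
        converter.optimizations = [tf.lite.Optimize.DEFAULT]
    tflite_model = converter.convert()
    with open(out_path, "wb") as f:
        f.write(tflite_model)
    return out_path
```

The resulting `.tflite` file is what would get copied to the Jetson over USB and run with the TFLite interpreter there.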

Acceptance Criteria

References

brandonC1234 commented 2 years ago

Progress

Was able to get the model to work on the Jetson by switching to TFLite. I achieved frame rates in the mid-3 fps range with a 320x320 input size and 0.8 fps with my more detailed and accurate 640x640 model.
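For reference, frame-rate numbers like these can be measured with a small rolling-average counter; a sketch, with the OpenCV webcam loop only indicated in comments:

```python
import time
from collections import deque


class FPSMeter:
    """Rolling-average FPS over the last `window` frames."""

    def __init__(self, window=30):
        self.stamps = deque(maxlen=window)

    def tick(self, t=None):
        # `t` lets tests inject timestamps; real use just calls tick().
        self.stamps.append(time.perf_counter() if t is None else t)

    def fps(self):
        if len(self.stamps) < 2:
            return 0.0
        span = self.stamps[-1] - self.stamps[0]
        return (len(self.stamps) - 1) / span if span > 0 else 0.0


# In the real loop (sketch):
#   cap = cv2.VideoCapture(0)
#   meter = FPSMeter()
#   while True:
#       ok, frame = cap.read()
#       ... run the TFLite interpreter on `frame` ...
#       meter.tick()
#       print(f"{meter.fps():.3f} fps")
```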

Future Improvements

From what I can see, TensorRT far outperforms TFLite, because TFLite has limited GPU support while TensorRT runs entirely on the GPU. This warrants further effort into getting TensorRT to work, but since most of the tooling around it is written in C and C++, I don't have very high hopes.
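One option that avoids dropping into C/C++ is TF-TRT, TensorFlow's Python wrapper around TensorRT. A hedged sketch, assuming a TensorFlow build with TensorRT support (as in NVIDIA's JetPack builds) and with placeholder paths:

```python
def convert_with_tftrt(saved_model_dir, out_dir):
    """Optimize a SavedModel with TensorRT via TF-TRT, all from Python.

    Sketch only: requires a TensorFlow build compiled with TensorRT support;
    both directory arguments are hypothetical.
    """
    from tensorflow.python.compiler.tensorrt import trt_convert as trt

    converter = trt.TrtGraphConverterV2(input_saved_model_dir=saved_model_dir)
    converter.convert()       # replace supported subgraphs with TRT ops
    converter.save(out_dir)   # the optimized model is still a SavedModel
    return out_dir
```

This keeps the workflow in Python, at the cost of only the TRT-convertible subgraphs getting the speedup.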

Outside sources

Here's a research paper about it, but since its models were run on a 2080, the difference is probably much greater than what it would be on the Jetson.

brandonC1234 commented 2 years ago

Progress

I finished the TFLite implementation and was able to get meaningful detection boxes. There was an issue with the developers changing the order of the output tensors in the latest TensorFlow version, but I was finally able to work around it.
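One way to sidestep that ordering issue is to identify the four SSD postprocess outputs (boxes, classes, scores, count) from their shapes and values instead of hard-coded indices. A sketch of that idea on plain nested lists; the heuristic for telling classes from scores is my assumption, not anything from the TFLite docs:

```python
def label_ssd_outputs(outputs):
    """Label raw SSD detection outputs regardless of their index order.

    `outputs` is a list of four nested-list tensors:
      boxes   -> shape (1, N, 4)
      scores  -> shape (1, N), floats in [0, 1]
      classes -> shape (1, N), whole-number class ids
      count   -> shape (1,)
    """
    labeled = {}
    flat_1n = []
    for out in outputs:
        if isinstance(out[0], list) and out[0] and isinstance(out[0][0], list):
            labeled["boxes"] = out            # only boxes are (1, N, 4)
        elif isinstance(out[0], list):
            flat_1n.append(out)               # (1, N): scores or classes
        else:
            labeled["count"] = out            # (1,): detection count
    for out in flat_1n:
        # Assumption: class ids are whole numbers, scores usually are not.
        if all(float(v).is_integer() for v in out[0]):
            labeled["classes"] = out
        else:
            labeled["scores"] = out
    return labeled
```

The same shape checks work on the arrays returned by `interpreter.get_tensor(...)`, so the detection code no longer depends on which TensorFlow version exported the model.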

Results

The general animal detection model achieves decent detection at a frame rate of 0.865 fps, but it does seem to confuse some dogs with cats. While the FPS is low, it only uses about 200 MB of RAM, so I can deploy both the general and personal models at the same time with hopefully few RAM allocation issues.
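Running both models at once is then just a matter of giving each its own interpreter. A sketch, assuming `tflite_runtime` is installed on the Jetson; the model names and file paths below are hypothetical:

```python
def load_detectors(model_paths):
    """Load several .tflite models side by side, one Interpreter per model."""
    from tflite_runtime.interpreter import Interpreter  # or tf.lite.Interpreter

    detectors = {}
    for name, path in model_paths.items():
        interp = Interpreter(model_path=path)
        interp.allocate_tensors()  # each model keeps its own tensor arena
        detectors[name] = interp
    return detectors


# Hypothetical usage:
# detectors = load_detectors({
#     "general": "general_animals_320.tflite",
#     "personal": "personal_pets_320.tflite",
# })
```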