TexasInstruments / edgeai-benchmark

This repository has been moved. The new location is https://github.com/TexasInstruments/edgeai-tensorlab (see also https://github.com/TexasInstruments/edgeai).

YOLOv5 Ti Lite Custom Model Compilation Process #9

Open dpetersonVT23 opened 2 years ago

dpetersonVT23 commented 2 years ago

Is this the correct repository to compile and deploy a custom trained YOLOv5 model from the YOLOv5 Ti repository (https://github.com/TexasInstruments/edgeai-yolov5)?

I am having trouble figuring out where to start in this repo, i.e., where to put the trained weights and how to begin compilation. I have run the setup script and have already trained my custom model using the edgeai-yolov5 repository.

Should I benchmark or compile first? What are the steps to do so successfully? Any guidance from this point is appreciated.
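To make the question concrete, here is my rough guess at the flow -- the entry-point script and the paths below are assumptions on my part, not confirmed names from this repo:

```bash
# My guess at the overall flow -- the entry-point script name and all paths
# are assumptions, not confirmed parts of edgeai-benchmark.

# 1. One-time environment setup (already done): installs the TIDL tools used for compilation
./setup.sh

# 2. Point a pipeline/settings config at the ONNX model exported from edgeai-yolov5,
#    e.g. something like .../edgeai-yolov5/runs/train/exp/weights/best.onnx

# 3. Run compilation/benchmarking in PC emulation mode to produce the compiled
#    model artifacts that get packaged for the target
./run_benchmarks_pc.sh
```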

mathmanu commented 2 years ago

Do you mean the BeagleBone AI-64? I thought that is the one based on TDA4VM.

dpetersonVT23 commented 2 years ago

Yes, I believe it is; that's why I believe the YOLOv5-Ti model will work on it and be able to take advantage of the AI chip on the board.

mathmanu commented 2 years ago

Yes, it should work on the BeagleBone AI-64.

mathmanu commented 2 years ago

But the version of TIDL-tools installed by edgeai-benchmark should match what is installed in the EdgeAI-SDK software that you are using on the BB AI-64.

See here: https://github.com/TexasInstruments/edgeai-benchmark/blob/master/setup.sh#L66. By default, 8.2 is installed as of now, and this will change to the next release when a new one is made.

You can install the appropriate version by uncommenting the corresponding lines and re-running the setup script.
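Roughly like this -- the variable name below is just for illustration; the actual lines to uncomment (and the exact version strings available) are in setup.sh at the link above:

```bash
# Illustrative sketch of the idea only -- the real lines to edit live inside
# setup.sh (see the link above); names and exact strings will differ.

# Pick the tools release that matches the EdgeAI SDK flashed on the BB AI-64:
TIDL_TOOLS_RELEASE="8.2"     # current default
# TIDL_TOOLS_RELEASE="8.1"   # uncomment this one instead for an 8.1 SDK

# ...then re-run the setup script:
./setup.sh
```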

dpetersonVT23 commented 2 years ago

Training has just completed, and it is now running TIDL in PC emulation mode. This was the initial output of compilation; I am still waiting for the .tar.gz file (assuming all these warnings are normal):

Number of OD backbone nodes = 192
Size of odBackboneNodeIds = 192

Preliminary subgraphs created = 1
Final number of subgraphs created are : 1, - Offloaded Nodes - 295, Total Nodes - 295
TIDL Meta PipeLine (Proto) File : /home/edgeai/code/edgeai-modelmaker/data/projects/santa/run/20220824-160939/compilation/tda4vm/modelartifacts/8bits/od-8100_onnxrt_yolov5s6_640_ti_lite_weights_best_onnx_tda4vm/model/best.prototxt

yolo_v3 yolo_v3 Warning : Requested Output Data Convert Layer is not Added to the network, It is currently not Optimal

**** Frame index 1 : Running float import ****
INFORMATION: [TIDL_ResizeLayer] Resize_107 Any resize ratio which is power of 2 and greater than 4 will be placed by combination of 4x4 resize layer and 2x2 resize layer. For example a 8x8 resize will be replaced by 4x4 resize followed by 2x2 resize.
INFORMATION: [TIDL_ResizeLayer] Resize_123 Any resize ratio which is power of 2 and greater than 4 will be placed by combination of 4x4 resize layer and 2x2 resize layer. For example a 8x8 resize will be replaced by 4x4 resize followed by 2x2 resize.
INFORMATION: [TIDL_ResizeLayer] Resize_139 Any resize ratio which is power of 2 and greater than 4 will be placed by combination of 4x4 resize layer and 2x2 resize layer. For example a 8x8 resize will be replaced by 4x4 resize followed by 2x2 resize.
WARNING: [TIDL_E_DATAFLOW_INFO_NULL] ti_cnnperfsim.out fails to allocate memory in MSMC. Please look into perfsim log. This model can only be used on PC emulation, it will get fault on target.

4 WARNINGS 0 ERRORS

0.0s: VX_ZONE_INIT:Enabled
0.7s: VX_ZONE_ERROR:Enabled
0.8s: VX_ZONE_WARNING:Enabled
0.404s: VX_ZONE_INIT:[tivxInit:178] Initialization Done !!!

dpetersonVT23 commented 2 years ago

Compilation has finished successfully and I have the .tar.gz file.

I am not sure of my timeline for getting this deployed on the BB AI-64 to see if the compiled model works as expected. I am happy to close this issue, and I will keep you updated if I make any progress with GPU usage in the Docker container or deployment on the board. I will also watch for the updates on hyperparameter configurability and more. Thank you @mathmanu for your great help.
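For when I do get to the board, this is roughly how I expect to move the compiled artifacts over -- the artifact name, paths, and board address below are placeholders, not the exact ones from my run:

```bash
# Placeholder names/paths -- substitute the actual .tar.gz produced by
# compilation and the user/IP of your BeagleBone AI-64.

ARTIFACT=od-8100_onnxrt_yolov5s6_640_ti_lite_weights_best_onnx_tda4vm.tar.gz
BOARD=debian@192.168.7.2

# Copy the packaged model artifacts to the board
scp "./${ARTIFACT}" "${BOARD}:/opt/model_zoo/"

# On the board: unpack into its own directory (target location is a placeholder)
ssh "${BOARD}" "cd /opt/model_zoo && mkdir -p yolov5s6_custom && tar -xzf ${ARTIFACT} -C yolov5s6_custom"
```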

mathmanu commented 2 years ago

OK. Glad that it worked.

Mugutech62 commented 11 months ago

> Compilation has finished successfully and I have the .tar.gz file.
>
> I am not sure of my timeline for getting this deployed on the BB AI-64 to see if the compiled model works as expected. I am happy to close this issue, and I will keep you updated if I make any progress with GPU usage in the Docker container or deployment on the board. I will also watch for the updates on hyperparameter configurability and more. Thank you @mathmanu for your great help.

Brother, could you kindly explain the steps required to compile our own custom YOLOv5 Ti Lite model, step by step? Or, if possible, please share your work with me as a document 🙏