TexasInstruments / edgeai-tidl-tools

Edge AI TIDL Tools and Examples - This repository contains tools and examples developed for the Deep Learning Runtime (DLRT) offering provided by TI's edge AI solutions.

TIDL - TI Deep Learning Product

TIDL is a comprehensive software product for accelerating Deep Neural Networks (DNNs) on TI's embedded devices. It supports heterogeneous execution of DNNs across Cortex-A based MPUs, TI's latest-generation C7x DSP, and TI's DNN accelerator (MMA). TIDL is released as part of TI's Software Development Kit (SDK), along with additional computer vision functions and optimized libraries such as OpenCV. TIDL is available on a variety of embedded devices from Texas Instruments.

TIDL is a fundamental software component of TI's Edge AI solution, which simplifies the whole product life cycle of DNN development and deployment by providing a rich set of tools and optimized libraries. DNN-based product development requires two main streams of expertise:

TI's Edge AI solution provides the right set of tools for both of these categories:

The figure below illustrates the workflow of DNN development and deployment on TI devices:

TI EdgeAI Work Flow

EdgeAI TIDL Tools

Introduction

TIDL provides multiple deployment options through industry-standard inference engines: TFLite Runtime, ONNX Runtime, and TVM/Neo-AI Runtime. These inference engines are referred to as Open Source Runtimes (OSRT) in this document.

* AM68PA has a Cortex-A72 as its MPU; refer to the device TRM to check which Cortex-A MPU your device contains.

This heterogeneous execution enables:

  1. Using OSRT as the top-level inference API for user applications
  2. Offloading subgraphs to C7x/MMA for accelerated execution with TIDL
  3. Running optimized code on the Arm core for layers that are not supported by TIDL

The Edge AI TIDL Tools provided in this repository support model compilation and model inference. The diagram below illustrates the TFLite-based workflow as an example; ONNX Runtime and TVM/Neo-AI Runtime follow a similar workflow.
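For reference, the sketch below shows the general shape of this TFLite-based flow in Python: compile once on X86_PC, then run inference with the generated artifacts. The delegate library names and option keys mirror the `osrt_python` examples in this repository, but treat the model path, artifacts folder, and option set as illustrative assumptions to verify against `examples/osrt_python/`.

```python
# Hedged sketch of the TFLite + TIDL delegate flow: compile on X86_PC, then
# run inference with the generated artifacts. The model path, artifacts
# folder and option set are illustrative assumptions only.
import os
import numpy as np
import tflite_runtime.interpreter as tflite

TIDL_TOOLS_PATH = os.environ["TIDL_TOOLS_PATH"]           # exported by setup.sh
MODEL_PATH = "models/public/mobilenet_v1_1.0_224.tflite"  # hypothetical model
delegate_options = {
    "tidl_tools_path": TIDL_TOOLS_PATH,
    "artifacts_folder": "model-artifacts/mobilenet_v1",   # hypothetical folder
    # The shipped examples pass further options (tensor bits, calibration
    # frames, accuracy level, ...); omitted here for brevity.
}

def run_once(delegate):
    """Create an interpreter with the given delegate and run one dummy frame."""
    interp = tflite.Interpreter(model_path=MODEL_PATH,
                                experimental_delegates=[delegate])
    interp.allocate_tensors()
    inp = interp.get_input_details()[0]
    interp.set_tensor(inp["index"], np.zeros(inp["shape"], dtype=inp["dtype"]))
    interp.invoke()
    return interp.get_tensor(interp.get_output_details()[0]["index"])

# 1. Model compilation (X86_PC only): the import delegate partitions the graph,
#    maps supported subgraphs to TIDL and writes artifacts to artifacts_folder.
compile_delegate = tflite.load_delegate(
    os.path.join(TIDL_TOOLS_PATH, "tidl_model_import_tflite.so"), delegate_options)
run_once(compile_delegate)

# 2. Model inference (X86_PC or TI SOC): the runtime delegate dispatches the
#    compiled subgraphs to C7x/MMA (or its PC emulation); remaining layers run
#    on the host CPU through the stock TFLite kernels.
infer_delegate = tflite.load_delegate("libtidl_tfl_delegate.so", delegate_options)
print(run_once(infer_delegate).shape)
```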

The table below summarizes the operations supported by this repository on X86_PC and on TI's development boards.

| Operation | X86_PC | TI SOC | Python API | CPP API |
| ------- |:-----------:|:-----------:|:-----------:|:-----------:|
| Model Compilation | :heavy_check_mark: | :x: | :heavy_check_mark: | :x: |
| Model Inference | :heavy_check_mark: | :heavy_check_mark: | :heavy_check_mark: | :heavy_check_mark: |
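ONNX Runtime follows the same pattern through execution providers, with unsupported layers falling back to the CPU. The sketch below is a non-authoritative illustration of inference via the Python API; the provider names, option keys, and model/artifact paths are assumptions based on the Python examples in this repository and should be checked against `examples/osrt_python/`.

```python
# Hedged sketch: ONNX Runtime inference with TIDL offload and CPU fallback.
# Provider names, option keys and paths are assumptions to verify against the
# osrt_python examples in this repository.
import os
import numpy as np
import onnxruntime as ort

tidl_options = {
    "tidl_tools_path": os.environ["TIDL_TOOLS_PATH"],
    "artifacts_folder": "model-artifacts/resnet18",   # hypothetical, produced by compilation
}

sess = ort.InferenceSession(
    "models/public/resnet18.onnx",                    # hypothetical model path
    providers=["TIDLExecutionProvider", "CPUExecutionProvider"],
    provider_options=[tidl_options, {}])

# Feed one dummy frame; dynamic dimensions (if any) are pinned to 1.
inp = sess.get_inputs()[0]
shape = [d if isinstance(d, int) else 1 for d in inp.shape]
outputs = sess.run(None, {inp.name: np.zeros(shape, dtype=np.float32)})
print([o.shape for o in outputs])
```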

What IS Supported

What IS NOT Supported

Supported Devices

| Device Family (Product) | Environment Variable | Hardware Acceleration |
| ----------------------- | -------------------- | :-------------------: |
| AM62 | am62 | :x: |
| AM62A | am62a | :heavy_check_mark: |
| AM67A | am67a | :heavy_check_mark: |
| AM68PA | am68pa | :heavy_check_mark: |
| AM68A | am68a | :heavy_check_mark: |
| AM69A | am69a | :heavy_check_mark: |
| J721E (TDA4VM) | am68pa | :heavy_check_mark: |
| J721S2 (TDA4AL, TDA4VL) | am68a | :heavy_check_mark: |
| J722S | am67a | :heavy_check_mark: |
| J784S4 (TDA4AP, TDA4VP, TDA4AH, TDA4VH) | am69a | :heavy_check_mark: |

Setup

Note: Before continuing with the steps below, check out the tag compatible with the SDK version you are using on TI's evaluation board. Refer to the SDK version compatibility table for the tag corresponding to your SDK version.

Prerequisites to set up on X86_PC

| OS | Python Version |
| ------------ | -------------- |
| Ubuntu 22.04 | 3.10 |

Setup on X86_PC

  sudo apt-get install libyaml-cpp-dev

Note: `source` in the setup command is important, as the script exports all required environment variables. Without it, users may encounter compilation or runtime issues.

 git clone https://github.com/TexasInstruments/edgeai-tidl-tools.git
 cd edgeai-tidl-tools
 git checkout <TAG Compatible with your SDK version>
 # Supported SOC name strings am62, am62a, am68a, am68pa, am69a, am67a
 export SOC=<Your SOC name>
 source ./setup.sh

Validate and Benchmark out-of-box examples

Compile and Validate on X86_PC

Successful execution generates the following folders and files:

  model-artifacts/
  models/
  output_images/
  test_report_pc.csv

Sample output images are produced for image classification, object detection, and semantic segmentation.
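To inspect the generated report programmatically, a minimal sketch is shown below; it assumes only that `test_report_pc.csv` is a plain CSV file with a header row and makes no assumption about specific column names.

```python
# Minimal sketch: print each row of the out-of-box test report generated on PC.
# Assumes only that test_report_pc.csv is a regular CSV file with a header row.
import csv

with open("test_report_pc.csv", newline="") as f:
    for row in csv.DictReader(f):
        # Each row summarises one out-of-box example run.
        print(", ".join(f"{key}={value}" for key, value in row.items()))
```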

Benchmark on TI SOC

 git clone https://github.com/TexasInstruments/edgeai-tidl-tools.git
 cd edgeai-tidl-tools
 git checkout <TAG Compatible with your SDK version>
 export SOC=<Your SOC name>
 export TIDL_TOOLS_PATH=$(pwd)
 # scp -r <pc>/edgeai-tidl-tools/model-artifacts/  <dev board>/edgeai-tidl-tool/
 # scp -r <pc>/edgeai-tidl-tools/models/  <dev board>/edgeai-tidl-tool/
 mkdir build && cd build
 cmake ../examples && make -j && cd ..
 python3 ./scripts/gen_test_report.py

Compile and Benchmark Custom Model

User Guide