TexasInstruments / edgeai-modelzoo

This repository has been moved. The new location is in https://github.com/TexasInstruments/edgeai-tensorlab
https://github.com/TexasInstruments/edgeai

Questions regarding model import #2

Closed · zengpeng629 closed this 3 years ago

zengpeng629 commented 3 years ago

Hi TI team,

I am using the ssdlite ONNX model provided here:

https://github.com/TexasInstruments/edgeai-modelzoo/blob/master/modelartifacts/8bits/od-8020_onnxrt_edgeai-mmdet_ssd-lite_mobilenetv2_512x512_20201214_model_onnx.tar.gz.link

to test the import tool provided by TIDL 7.3. The compilation is error-free, but it creates a lot of graphs, which looks very strange compared to the artifacts from the download link (which contain only one .bin model). These are the generated files:

624_tidl_io_1.bin 632_tidl_io_1.bin 640_tidl_io_1.bin 648_tidl_io_1.bin 656_tidl_io_1.bin 664_tidl_io_1.bin allowedNode.txt 624_tidl_net.bin 632_tidl_net.bin 640_tidl_net.bin 648_tidl_net.bin 656_tidl_net.bin 664_tidl_net.bin tempDir 628_tidl_io_1.bin 636_tidl_io_1.bin 644_tidl_io_1.bin 652_tidl_io_1.bin 660_tidl_io_1.bin 668_tidl_io_1.bin 628_tidl_net.bin 636_tidl_net.bin 644_tidl_net.bin 652_tidl_net.bin 660_tidl_net.bin 668_tidl_net.bin
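For context, the import tool emits one `*_tidl_net.bin` / `*_tidl_io_1.bin` pair per offloaded TIDL subgraph (the numeric prefix appears to be a node index assigned during import), so counting the `*_tidl_net.bin` files in the listing above gives the number of subgraphs. A quick sketch using the exact file list from this report:

```python
# One (<n>_tidl_net.bin, <n>_tidl_io_1.bin) pair is generated per TIDL
# subgraph, so the *_tidl_net.bin count equals the number of subgraphs.
files = (
    "624_tidl_io_1.bin 632_tidl_io_1.bin 640_tidl_io_1.bin 648_tidl_io_1.bin "
    "656_tidl_io_1.bin 664_tidl_io_1.bin allowedNode.txt 624_tidl_net.bin "
    "632_tidl_net.bin 640_tidl_net.bin 648_tidl_net.bin 656_tidl_net.bin "
    "664_tidl_net.bin tempDir 628_tidl_io_1.bin 636_tidl_io_1.bin "
    "644_tidl_io_1.bin 652_tidl_io_1.bin 660_tidl_io_1.bin 668_tidl_io_1.bin "
    "628_tidl_net.bin 636_tidl_net.bin 644_tidl_net.bin 652_tidl_net.bin "
    "660_tidl_net.bin 668_tidl_net.bin"
).split()

subgraphs = sorted(int(f.split("_")[0]) for f in files if f.endswith("_tidl_net.bin"))
print(len(subgraphs), subgraphs)
# → 12 subgraphs, versus the single subgraph in the pre-built artifacts
```

Twelve subgraphs instead of one means most of the network fell back to ONNX Runtime between small TIDL islands, which also explains the poor inference speed reported below.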

I followed every step provided here

https://software-dl.ti.com/jacinto7/esd/processor-sdk-rtos-jacinto7/07_03_00_07/exports/docs/tidl_j7_02_00_00_07/ti_dl/docs/user_guide_html/md_tidl_osr_onnxrt_tidl.html

to convert the ONNX model to TIDL bin models. The command lines are easy to run, but the process looks like a black box to me, so it is very hard to figure out what happened during conversion.

Another problem is that the list of graphs above can be used directly for inference using

python3 onnxrt_ep.py

but the speed is very slow, which also confuses me since it is ssdlite:

1, ssdlite, Total time : 1751.7ms, Offload Time : 1629.5ms

Could you give me some hints about why it created so many graphs, and why the speed is slow? One possible reason, I think, is that TIDL 7.3 is too old and 8.0 might help, but I am not sure.

Also I cannot run inference using the artifacts provided in the download link, here is the error:

python onnxrt_ep.py
Available execution providers : ['TIDLExecutionProvider', 'TIDLCompilationProvider', 'CPUExecutionProvider']

Running 1 Models - ['ssdlite']

Running model ssdlite

libtidl_onnxrt_EP loaded 0x565327250bb0
0.0s: VX_ZONE_INIT:Enabled
0.8s: VX_ZONE_ERROR:Enabled
0.11s: VX_ZONE_WARNING:Enabled

Preliminary subgraphs created = 1
Final number of subgraphs created are : 1, - Offloaded Nodes - 482, Total Nodes - 482
TIDL_RT_OVX: ERROR: Config file size (37256 bytes) does not match size of sTIDL_IOBufDesc_t (36360 bytes)
0.93229s: VX_ZONE_ERROR:[tivxAddKernelTIDL:260] invalid values for num_input_tensors or num_output_tensors
0.94293s: VX_ZONE_ERROR:[ownTensorCheckSizes:187] Invalid view parameter(s) in dimension: 0
Segmentation fault (core dumped)

Could you help me with it?

Best, Zeng

mathmanu commented 3 years ago

You have to provide 'object_detection:meta_layers_names_list' and 'object_detection:meta_arch_type' for the offload to be able to optimize the detection post processing layers.

https://github.com/TexasInstruments/edgeai-tidl-tools/blob/master/examples/osrt_python/ort/onnxrt_ep.py#L114 https://github.com/TexasInstruments/edgeai-tidl-tools/blob/master/examples/osrt_python/ort/utils.py#L306

Please try it.
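In ONNX Runtime terms, these keys go into the compilation options dict that is passed as provider options to the TIDL compilation provider. A minimal sketch under the assumptions of this thread (`models_base_path` and the prototxt path are placeholders taken from the later comments; the session-creation call is shown commented out because it requires the TIDL-enabled onnxruntime build):

```python
import os

models_base_path = "./onnx_models"  # placeholder path, not from TI docs

# Subset of the compilation options used in examples/osrt_python/ort
delegate_options = {
    "tidl_tools_path": "../../../tidl_tools/",
    "artifacts_folder": "./onnxrt-artifacts/ssdlite",
    "tensor_bits": 8,
}

# The two keys Manu refers to: they point TIDL at the meta-architecture
# prototxt so the SSD detection post-processing can be offloaded as one
# optimized block. meta_arch_type 3 is the value used for this
# edgeai-mmdet SSD model later in the thread.
delegate_options.update({
    "object_detection:meta_layers_names_list": os.path.join(
        models_base_path,
        "ssdlite/ssd-lite_mobilenetv2_512x512_20201214_model.prototxt",
    ),
    "object_detection:meta_arch_type": 3,
})

# With the TIDL-enabled onnxruntime build, the options are passed like:
# import onnxruntime as rt
# sess = rt.InferenceSession(
#     "model.onnx",
#     providers=["TIDLCompilationProvider", "CPUExecutionProvider"],
#     provider_options=[delegate_options, {}],
# )
```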

mathmanu commented 3 years ago

You can also see the parameters that we used to compile the model here: https://github.com/TexasInstruments/edgeai-benchmark/blob/master/configs/detection.py#L83 But this is from the edgeai-benchmark repository which is a wrapper over edgeai-tidl-tools - you can use it for reference even if you are directly using edgeai-tidl-tools.

zengpeng629 commented 3 years ago

You have to provide 'object_detection:meta_layers_names_list' and 'object_detection:meta_arch_type' for the offload to be able to optimize the detection post processing layers.

https://github.com/TexasInstruments/edgeai-tidl-tools/blob/master/examples/osrt_python/ort/onnxrt_ep.py#L114 https://github.com/TexasInstruments/edgeai-tidl-tools/blob/master/examples/osrt_python/ort/utils.py#L306

Please try it.

Hi Manu,

Yes, I have added those configs:

'meta_layers_names_list' : os.path.join(models_base_path, 'ssdlite/ssd-lite_mobilenetv2_512x512_20201214_model.prototxt'), 'meta_arch_type' : 3

and added them to delegate_options, but it still generated a lot of graphs, most of which have only 4 nodes:


mathmanu commented 3 years ago

I think those should be 'object_detection:meta_layers_names_list' and 'object_detection:meta_arch_type'.

Please show your delegate options.

zengpeng629 commented 3 years ago

I think those should be 'object_detection:meta_layers_names_list' and 'object_detection:meta_arch_type'.

Please show your delegate options.

(Pdb) delegate_options
{'tidl_tools_path': '../../../tidl_tools/',
 'artifacts_folder': './onnxrt-artifacts/ssdlite',
 'platform': 'J7',
 'version': '7.3',
 'tensor_bits': 8,
 'debug_level': 0,
 'max_num_subgraphs': 1,
 'deny_list': '',
 'accuracy_level': 1,
 'advanced_options:calibration_frames': 3,
 'advanced_options:calibration_iterations': 3,
 'advanced_options:output_feature_16bit_names_list': '',
 'advanced_options:params_16bit_names_list': '',
 'advanced_options:quantization_scale_type': 0,
 'advanced_options:high_resolution_optimization': 0,
 'advanced_options:pre_batchnorm_fold': 1,
 'ti_internal_nc_flag': 1601,
 'advanced_options:activation_clipping': 1,
 'advanced_options:weight_clipping': 1,
 'advanced_options:bias_calibration': 1,
 'advanced_options:channel_wise_quantization': 0,
 'object_detection:meta_layers_names_list': './onnx_models/ssdlite/ssd-lite_mobilenetv2_512x512_20201214_model.prototxt',
 'object_detection:meta_arch_type': 3}

mathmanu commented 3 years ago

I think this optimized offloading of detection layers is supported only from 8.0 onwards. Please install edgeai-tidl-tools 8.0 using this script and try: https://github.com/TexasInstruments/edgeai-tidl-tools/blob/master/setup.sh

zengpeng629 commented 3 years ago

I think this optimized offloading of detection layers is supported only from 8.0 onwards. Please install edgeai-tidl-tools 8.0 using this script and try: https://github.com/TexasInstruments/edgeai-tidl-tools/blob/master/setup.sh

I also think it is a version problem. Thank you!

zengpeng629 commented 3 years ago

I think this optimized offloading of detection layers is supported only from 8.0 onwards. Please install edgeai-tidl-tools 8.0 using this script and try: https://github.com/TexasInstruments/edgeai-tidl-tools/blob/master/setup.sh

Hi,

Currently I cannot upgrade ti-processor-sdk-rtos from 7.3 to 8.0 on my PC and TDA4 due to some issues. Is it possible to install only edgeai-tidl-tools 8.0?

mathmanu commented 3 years ago

Yes. Just run the setup file that I shared. But the resulting compiled model will work only with 8.0; it won't work with 7.3.