Edgeai TIDL Tools and Examples - This repository contains tools and examples developed for the Deep Learning Runtime (DLRT) offering provided by TI's edge AI solutions.
python onnxrt_ep.py -c
Available execution providers : ['TIDLExecutionProvider', 'TIDLCompilationProvider', 'CPUExecutionProvider']
Running 1 Models - ['cl-dlr_mobilenetv3_large']
Running_Model : cl-dlr_mobilenetv3_large
Running shape inference on model mobilenetv3_large_qdq.onnx
Preliminary subgraphs created = 30
*** WARNING : Number of subgraphs generated > max_num_subgraphs provided in options - additional subgraphs are delegated to ARM ***
Final number of subgraphs created are : 16, - Offloaded Nodes - 331, Total Nodes - 493
Could not find const or initializer of layer /module/0_34/Conv !!!
Weight Tensor size is not matching with Proto kernel_shape
Could not find const or initializer of layer /module/0_34/Conv !!!
Unsupported Onnx import data type : 0
Could not find const or initializer of layer /module/0_34/Conv !!!
Unsupported Onnx import data type : 0
Could not find const or initializer of layer /module/fc1_4/Conv !!!
Weight Tensor size is not matching with Proto kernel_shape
Could not find const or initializer of layer /module/fc1_4/Conv !!!
Unsupported Onnx import data type : 0
Could not find const or initializer of layer /module/fc1_4/Conv !!!
Unsupported Onnx import data type : 0
Could not find const or initializer of layer /module/fc2_4/Conv !!!
Weight Tensor size is not matching with Proto kernel_shape
Could not find const or initializer of layer /module/fc2_4/Conv !!!
Hi,
I'm attempting to compile a QDQ model with the config option:
but it seems the QDQ model is unsupported:
How can I compile the QDQ model?
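For context, this is a minimal sketch of how the compile-time provider arguments are typically assembled for an ONNX Runtime session with the TIDL compilation provider. The specific option names and values (`tidl_tools_path`, `artifacts_folder`, the quantization flag) are assumptions based on the edgeai-tidl-tools examples, not a verified configuration for this model:

```python
# Sketch of TIDL compile options for an ONNX Runtime session.
# NOTE: option names and values here are assumptions modeled on the
# edgeai-tidl-tools examples; check the release you are using.

compile_options = {
    "tidl_tools_path": "/path/to/tidl_tools",   # hypothetical path
    "artifacts_folder": "./model-artifacts",    # compiled artifacts output dir
    # For a pre-quantized (QDQ) model the quantization parameters are
    # usually taken from the model itself; the exact flag/value is an
    # assumption here:
    "advanced_options:quantization_scale_type": 4,
    "max_num_subgraphs": 16,                    # matches the log above
}

def build_provider_args(options):
    """Return the providers and per-provider options as
    onnxruntime.InferenceSession expects them."""
    providers = ["TIDLCompilationProvider", "CPUExecutionProvider"]
    provider_options = [options, {}]
    return providers, provider_options

providers, provider_options = build_provider_args(compile_options)

# With onnxruntime and the TIDL tools installed, the session would then be
# created roughly like this:
# import onnxruntime as ort
# sess = ort.InferenceSession("mobilenetv3_large_qdq.onnx",
#                             providers=providers,
#                             provider_options=provider_options)
```

The dictionary is passed positionally alongside the provider list, so the first `provider_options` entry applies to `TIDLCompilationProvider` and the second (empty) one to the CPU fallback.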
@mathmanu @debapriyamaji @gibrane @kumardesappan @yuanzhaoti
Thank you!