TexasInstruments / edgeai-benchmark

This repository has been moved. The new location is https://github.com/TexasInstruments/edgeai-tensorlab (see also https://github.com/TexasInstruments/edgeai).

Issue running with ONNX quantized model with TIDLExecutionProvider #16

Open · LucasFischer123 opened this issue 1 year ago

LucasFischer123 commented 1 year ago

Hi,

I have an issue using https://dev.ti.com/edgeaisession/ and edgeai-benchmark.

I have developed a quantization method for neural networks using ONNX, and I wanted to test it with the TIDLExecutionProvider, as I need to deploy it for one of my customers.

I used the custom-model-onnx notebook and uploaded the following model, based on MobileNet: mobilenetv2-folded-quant.zip

As soon as I try to load it with the TI inference engine, the Jupyter kernel running the benchmark dies.

I then tried a non-quantized model, with no more success: the Jupyter kernel dies as soon as I try to use the TIDLExecutionProvider. mobilenetv2_fold.zip
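
For reference, this is roughly what the session creation looks like on my side (a minimal sketch; the TIDL provider option keys and the paths are assumptions based on the TI example notebooks, not necessarily what the benchmark code does internally):

```python
import numpy as np
import onnxruntime as ort

# Provider options for TIDLExecutionProvider. The keys (tidl_tools_path,
# artifacts_folder) follow the TI examples as I understand them; the paths
# below are placeholders for my setup.
tidl_options = {
    "tidl_tools_path": "/path/to/tidl_tools",           # placeholder
    "artifacts_folder": "/path/to/compiled-artifacts",  # placeholder
}

# Create the session with TIDL first and CPU as fallback.
session = ort.InferenceSession(
    "mobilenetv2-folded-quant.onnx",
    providers=["TIDLExecutionProvider", "CPUExecutionProvider"],
    provider_options=[tidl_options, {}],
)

# The kernel dies at or right after session creation / the first run.
input_name = session.get_inputs()[0].name
dummy = np.zeros((1, 3, 224, 224), dtype=np.float32)
outputs = session.run(None, {input_name: dummy})
```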

I don't really know how to move forward.

Thanks for the help; I need to deploy for customers in the coming weeks.