geaxgx / openvino_blazepose


Hello, can the ONNX model be inferenced by TensorRT? And what are the mean and std? Does the model converted by the tflite2tensorflow tool apply any special operation — has the input already been divided by the std and had the mean subtracted? #2

Open zcl912 opened 2 years ago

geaxgx commented 2 years ago

I am not sure I understand your question. Despite its name, the tflite2tensorflow tool (https://github.com/PINTO0309/tflite2tensorflow) does not only convert a tflite file to a tensorflow model; it can actually convert to many different formats: saved_model, tfjs, tf-trt, EdgeTPU, CoreML, quantized tflite, ONNX, OpenVINO, Myriad Inference Engine blob. In my repo, I am just using the OpenVINO format. You can use PINTO's tool to convert the models yourself, or you can find the already converted Blazepose models in PINTO's model zoo: https://github.com/PINTO0309/PINTO_model_zoo/tree/main/053_BlazePose
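Regarding the mean/std part of the question, a minimal sketch of the generic (pixel - mean) / std preprocessing pattern that such a model may or may not embed internally — the mean/std values, input size, and NCHW layout below are illustrative assumptions, not the values baked into the converted Blazepose models:

```python
import numpy as np

def normalize(frame, mean=0.0, std=255.0):
    """Illustrative mean/std normalization for an HxWx3 uint8 frame.

    With the assumed defaults (mean=0, std=255) this simply maps pixel
    values from [0, 255] into [0, 1]. If a converted model already embeds
    this step in its graph, the caller must feed raw pixels instead.
    """
    img = (frame.astype(np.float32) - mean) / std
    # HWC -> NCHW with a batch dimension, a common layout for OpenVINO/ONNX
    return np.transpose(img, (2, 0, 1))[np.newaxis]

# Example: a dummy all-white 256x256 frame (hypothetical input size)
dummy = np.full((256, 256, 3), 255, dtype=np.uint8)
tensor = normalize(dummy)
print(tensor.shape)  # (1, 3, 256, 256)
print(tensor.max())  # 1.0
```

A quick way to check whether normalization is embedded is to inspect the first layers of the converted graph (e.g. in Netron) for Sub/Div or Mul/Add nodes applied to the input.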