Seanlinx / mtcnn


How do I optimize the model for inference using TensorRT #71

Open santhoshnumberone opened 5 years ago

santhoshnumberone commented 5 years ago

I found this MXNet topic: How do I use TensorRT integration?

Mainly to speed up inference using TensorRT.

Flow: MXNet → ONNX → TensorRT

To convert MXNet to ONNX I found this link: How to convert an ArcFace MXNet model to an ONNX model?

But when I use

from __future__ import absolute_import
from __future__ import division
from __future__ import print_function

import sys
import os
import argparse
import numpy as np
import mxnet as mx

parser = argparse.ArgumentParser(description='face model slim')
# general
parser.add_argument('--model', default='pnet-0016.params', help='path to load model.')
args = parser.parse_args()
epoch = 16
prefix = args.model
print('loading',prefix, epoch)
save_dict = mx.model.load_checkpoint(prefix, epoch)

I get this:

loading pnet-0016.params 16
Traceback (most recent call last):
  File "model_slim.py", line 25, in <module>
    save_dict = mx.model.load_checkpoint(prefix, epoch)
  File "C:\Users\Plato\Anaconda3\envs\gpu-env\lib\site-packages\mxnet\model.py", line 419, in load_checkpoint
    symbol = sym.load('%s-symbol.json' % prefix)
  File "C:\Users\Plato\Anaconda3\envs\gpu-env\lib\site-packages\mxnet\symbol\symbol.py", line 2535, in load
    check_call(_LIB.MXSymbolCreateFromFile(c_str(fname), ctypes.byref(handle)))
  File "C:\Users\Plato\Anaconda3\envs\gpu-env\lib\site-packages\mxnet\base.py", line 149, in check_call
    raise MXNetError(py_str(_LIB.MXGetLastError()))
mxnet.base.MXNetError: [15:11:37] C:\Jenkins\workspace\mxnet-tag\mxnet\3rdparty\dmlc-core\src\io\local_filesys.cc:199: Check failed: allow_null LocalFileSystem::Open "pnet-0016.params-symbol.json": No such file or directory

It is expecting a JSON file, which we don't have.
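
Looking at mxnet/model.py, load_checkpoint builds both filenames from one prefix ('%s-symbol.json' % prefix and '%s-%04d.params' % (prefix, epoch)), which is why it goes looking for pnet-0016.params-symbol.json. So presumably the prefix argument has to be the checkpoint prefix rather than the params filename. A minimal sketch of what I think the call should look like, assuming pnet-symbol.json and pnet-0016.params sit in the working directory:

import mxnet as mx

prefix = 'pnet'  # checkpoint prefix, not the .params filename (assumed layout)
epoch = 16
# load_checkpoint returns a (symbol, arg_params, aux_params) tuple, not a dict
sym, arg_params, aux_params = mx.model.load_checkpoint(prefix, epoch)

That still leaves the missing pnet-symbol.json: if only the .params file ships, the symbol would presumably have to be rebuilt from the network definition (the P_Net function in core/symbol.py, if I read this repo right) and saved once with sym.save('pnet-symbol.json').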

Can anyone tell me what I am doing wrong?

Or what is the actual procedure for the conversion?
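
For reference, once the symbol and params load, the MXNet → ONNX leg of the flow above would presumably go through mxnet.contrib.onnx (available since MXNet 1.3). A minimal sketch, assuming the pnet-symbol.json / pnet-0016.params pair from above and PNet's 12x12 RGB input:

import numpy as np
from mxnet.contrib import onnx as onnx_mxnet

# file names are assumptions based on the checkpoint layout discussed above
sym_file = 'pnet-symbol.json'
params_file = 'pnet-0016.params'

# PNet consumes 12x12 RGB crops, hence a batch x 3 x 12 x 12 input shape
onnx_path = onnx_mxnet.export_model(sym_file, params_file,
                                    [(1, 3, 12, 12)], np.float32,
                                    'pnet.onnx')
print('exported', onnx_path)

The last leg, ONNX → TensorRT, should then be a matter of feeding pnet.onnx to TensorRT's ONNX parser or its bundled trtexec tool (trtexec --onnx=pnet.onnx --saveEngine=pnet.trt), though I have not verified that every MTCNN op is supported.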