
Tengine Convert Tools


Introduction

Tengine Convert Tool supports converting models from multiple frameworks into the tmfile format used by the Tengine-Lite AI framework. Since this tool relies on protobuf to parse the proto files of Caffe, ONNX, TensorFlow, TFLite and others, it only runs on x86 Linux systems.

Install dependent libraries
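The original package list for this section was not preserved. As a sketch, on a Debian/Ubuntu system the build tools and the protobuf dependency mentioned above can typically be installed as follows (package names are assumptions; adjust for your distribution):

```shell
# install build tools and protobuf (needed to parse Caffe/ONNX/TensorFlow protos)
sudo apt update
sudo apt install -y cmake make g++ git libprotobuf-dev protobuf-compiler
```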

Build Convert Tool

mkdir build && cd build
cmake ..
make -j`nproc` && make install

Execution File

Run Convert Tool

How to use

$ ./convert_tool -h
[Convert Tools Info]: optional arguments:
        -h    help            show this help message and exit
        -f    input type      source framework of the input model (e.g. caffe, onnx, mxnet, darknet, tensorflow, tflite, megengine)
        -p    input structure path to the network structure of input model(*.prototxt, *.symbol, *.cfg)
        -m    input params    path to the network params of input model(*.caffemodel, *.params, *.weight, *.pb, *.onnx, *.tflite)
        -o    output model    path to output fp32 tmfile

To run the convert tool, use a command like the following. Note: the command examples below are based on the MobileNet model:
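As a sketch, two typical invocations for Caffe and ONNX MobileNet models might look like this (the model file names here are placeholders; substitute your own files):

```shell
# Caffe: needs both the network structure (-p) and the weights (-m)
./install/bin/convert_tool -f caffe -p mobilenet_deploy.prototxt -m mobilenet.caffemodel -o mobilenet.tmfile

# ONNX: the .onnx file carries both structure and weights, so -p is not needed
./install/bin/convert_tool -f onnx -m mobilenet.onnx -o mobilenet_onnx.tmfile
```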

How to enable MegEngine support (optional)

# clone MegEngine
git clone https://github.com/MegEngine/MegEngine.git

# prepare for building
cd MegEngine
./third_party/prepare.sh
./third_party/install-mkl.sh
mkdir build && cd build

# build MegEngine with DEBUG mode
cmake .. -DMGE_WITH_TEST=OFF -DMGE_BUILD_SDK=OFF -DCMAKE_BUILD_TYPE=Debug -DCMAKE_INSTALL_PREFIX={PREDEFINED_INSTALL_PATH}
make -j`nproc`
make install
make develop

# export environment
export PYTHONPATH=/path/to/MegEngine/python_module

# test with python
python3 -c "import megengine"

# clone Tengine convert tool
git clone https://github.com/OAID/Tengine-Convert-Tools
cd Tengine-Convert-Tools
mkdir build && cd build

# build with MegEngine support
cmake -DBUILD_MEGENGINE_SERIALIZER=ON -DMEGENGINE_INSTALL_PATH={PREDEFINED_INSTALL_PATH} ..
make -j`nproc`
make install

# the remaining steps run in Python:
# get a pre-trained resnet18 model from MegEngine Model Hub
import megengine.hub
resnet18 = megengine.hub.load("megengine/models", "resnet18", pretrained=True)

# use MegEngine trace to deal with downloaded model
from megengine.jit import trace
import megengine.functional as F

@trace(symbolic=True)
def pred_fun(data, *, net):
    net.eval()
    pred = net(data)
    # if model has softmax
    pred_normalized = F.softmax(pred)
    return pred_normalized

# fill a random input for model
import numpy as np
data = np.random.random((1, 3, 224, 224)).astype(np.float32)

# trace and save the model
pred_fun.trace(data, net=resnet18)
pred_fun.dump('new_model.pkl')

A Jupyter notebook, MegEngine.ipynb, is provided for reference.

# convert the dumped MegEngine model to a tmfile
./install/bin/convert_tool -f megengine -m new_model.pkl -o resnet18.tmfile

How to add a self-defined operator

License

Tech Forum