tsnlab / connx

C implementation of Open Neural Network Exchange Runtime
GNU General Public License v3.0



CONNX

CONNX is an abbreviation of "C-language implementation of the ONNX Runtime". It targets IoT devices such as the ESP32, the Raspberry Pi 3/4, and boards running FreeRTOS, and can be used as an alternative to TensorFlow Lite (tflite).

MNIST example on ESP32

Architecture

Features

Usage

How to use

CONNX running process overview

  1. Load the ONNX model.
  2. Create the runtime to run the model.
  3. Feed CONNX tensors as input into the runtime; the output is returned as CONNX tensors as well.

Installation instructions

Run examples

# Run the MNIST example.
connx/build$ poetry run ninja mnist
# Run the MobileNet example.
connx/build$ poetry run ninja mobilenet
# Run the YOLOv4 example.
connx/build$ poetry run ninja yolov4

or use the connx executable

connx/build$ ./connx ../examples/mnist/ ../examples/mnist/test_data_set_1/input_0.data

or use the Python script

connx$ python3 bin/run.py examples/mnist/ examples/mnist/test_data_set_1/input_0.data

Notice: If you want to run on a Raspberry Pi 3, please compile in Release mode (CMAKE_BUILD_TYPE=Release), because the sanitizer enabled in other build types causes problems on that platform.
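As a sketch, such a Release configuration could look like the following, assuming a standard out-of-tree CMake build with the Ninja generator (as the examples above use ninja); the exact invocation for your checkout may differ:

```shell
# From the repository root: configure an out-of-tree Release build.
# CMAKE_BUILD_TYPE=Release avoids the sanitizer used in other build types.
mkdir -p build && cd build
cmake -G Ninja -DCMAKE_BUILD_TYPE=Release ..
poetry run ninja
```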

Using Python bindings

import connx
import numpy

model = connx.load_model('examples/mnist')
input_data = connx.load_data('examples/mnist/test_data_set_0/input_0.data')
reference_data = connx.load_data('examples/mnist/test_data_set_0/output_0.data')

# Run the model with the input data.
# output_data is a list that contains the output tensors.
output_data = model.run([input_data])

# Convert connx.Tensor to numpy.ndarray.
reference_nparray = reference_data.to_nparray()
output_nparray = output_data[0].to_nparray()

# Check the output against the reference data.
assert reference_data.shape == output_data[0].shape
assert numpy.allclose(reference_nparray, output_nparray)

# You can also convert a numpy.ndarray back to a connx.Tensor.
tensor = connx.Tensor.from_nparray(output_nparray)

Please refer to bin/run.py for more information.
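The numpy.allclose check above passes when each output element is within a relative and absolute tolerance of the reference. A small self-contained illustration of those semantics, using plain NumPy with made-up values (no connx required):

```python
import numpy as np

# Hypothetical stand-ins for a reference tensor and a model output
# converted via to_nparray(); real values come from the .data files.
reference = np.array([0.1, 0.9], dtype=np.float32)
output = np.array([0.1000001, 0.8999999], dtype=np.float32)  # tiny float error

# allclose passes when |output - reference| <= atol + rtol * |reference|
ok = bool(np.allclose(reference, output))          # default rtol=1e-5, atol=1e-8
bad = bool(np.allclose(reference, output + 0.01))  # 0.01 exceeds the tolerance
```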

ONNX compatibility test

The ONNX compatibility tests have moved to the onnx-connx project.

Ports

Contribution

See CONTRIBUTING.md.

Supported platforms

License

CONNX is licensed under GPLv3; see LICENSE. If you need a license other than GPLv3 for proprietary use, or professional support, please email us at contact at tsnlab dot com.