larq / compute-engine

Highly optimized inference engine for Binarized Neural Networks
https://docs.larq.dev/compute-engine
Apache License 2.0
240 stars · 33 forks

Error on import #710

Closed MatFrancois closed 2 years ago

MatFrancois commented 2 years ago

Hello, I'm trying to use your larq_compute_engine package on a Jetson Nano 4 GB. I built it with Make, following the ARM build tutorial: https://docs.larq.dev/compute-engine/build/arm/

Unfortunately, I get the following error on import:

>>> import larq_compute_engine
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/home/greenai/Documents/matthieu/compute-engine/larq_compute_engine/__init__.py", line 1, in <module>
    from larq_compute_engine.mlir.python.converter import (
  File "/home/greenai/Documents/matthieu/compute-engine/larq_compute_engine/mlir/python/converter.py", line 8, in <module>
    from larq_compute_engine.mlir._tf_tfl_flatbuffer import (
ModuleNotFoundError: No module named 'larq_compute_engine.mlir._tf_tfl_flatbuffer'

Any idea how to solve this would be great! Thanks.

Tombana commented 2 years ago

The larq-compute-engine Python package contains the converter, which converts your Keras models to .tflite files. It is meant to be run on an x86 host machine. To run those .tflite files on a platform like the Jetson Nano, you need the runtime, which is separate from the Python package.

The ARM tutorial that you linked to shows how to build the runtime. It is not a Python package but an executable (lce_benchmark_model in the example) that you can use to run the .tflite files on the target device.
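The on-device step might look like the following sketch; the hostname, paths, and flag values are placeholders, and the flags assume lce_benchmark_model accepts the standard TFLite benchmark_model options:

```shell
# Copy the converted model from the x86 host to the Jetson Nano
# (hostname and paths are placeholders).
scp model.tflite greenai@jetson-nano:~/models/

# On the device: run the model with the binary built via the ARM tutorial.
# --graph and --num_threads follow the TFLite benchmark_model convention.
./lce_benchmark_model --graph=models/model.tflite --num_threads=4
```

The key point is that only the .tflite file crosses from host to device; the Python package never needs to be installed on the Jetson Nano.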

MatFrancois commented 2 years ago

Oh ok thanks a lot for that explanation!