Expected behavior
I am following the instructions for AoT compilation of AI models here. Specifically, I built tvm from source with set(USE_LLVM ON), set(USE_MICRO ON), and set(USE_MICRO_STANDALONE_RUNTIME ON). I should be able to run make inside the tvm/apps/bundle_deploy directory to compile some example inference code into standalone executables.
Actual behavior
Always getting the following error:
ls: ../../build/standalone_crt: No such file or directory
Makefile:24: *** "CRT not found. Ensure you have built the standalone_crt target and try again". Stop.
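The failing check can be reproduced outside of make. The sketch below shows what the Makefile guard appears to do, inferred only from the error text above (the exact Makefile logic is an assumption, not confirmed from the TVM source):

```shell
# Sketch of the guard around apps/bundle_deploy/Makefile line 24 (logic
# inferred from the error output): complain unless the standalone CRT
# tree exists at ../../build/standalone_crt relative to bundle_deploy.
CRT_ROOT="../../build/standalone_crt"
if [ ! -d "$CRT_ROOT" ]; then
  echo "CRT not found. Ensure you have built the standalone_crt target and try again"
fi
```

A commonly suggested remedy for this message is to run make standalone_crt explicitly inside the tvm/build directory after configuring with USE_MICRO enabled, since a plain make may not populate build/standalone_crt; whether that applies here is an assumption.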
The following are the contents of my build directory after building tvm with those three options enabled in tvm/cmake/config.cmake:
Environment
I tried this both on bare-metal Intel macOS and via the Dockerfile below, with the same error in each case.
Steps to reproduce
1. Clone tvm locally.
2. Edit tvm/cmake/config.cmake to enable set(USE_LLVM ON), set(USE_MICRO ON), and set(USE_MICRO_STANDALONE_RUNTIME ON).
3. Run docker build -f Dockerfile.tvm -t tvm-bundle-test .
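Before invoking make in apps/bundle_deploy (in the container or locally), a quick sanity check for the missing prerequisite narrows the problem down. The paths below assume a checkout at ./tvm with a CMake build directory at ./tvm/build, and the suggested fix is an assumption, not a confirmed remedy:

```shell
# Check for the directory bundle_deploy's Makefile requires, and point at
# the standalone_crt target as the likely fix if it is absent.
if [ -d tvm/build/standalone_crt ]; then
  echo "standalone_crt present"
else
  echo "standalone_crt missing: try 'make standalone_crt' inside tvm/build"
fi
```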
Here is the reference Dockerfile:
Triage
Please refer to the list of label tags here to find the relevant tags and add them below in a bullet format (example below).