apache / tvm

Open deep learning compiler stack for cpu, gpu and specialized accelerators
https://tvm.apache.org/
Apache License 2.0

TVMError: Binary was created using cuda but a loader of that name is not registered #13595

Open algoholic opened 1 year ago

algoholic commented 1 year ago

My earlier Docker setup downloaded and built TVM with the CUDA flags enabled:

# install tvm
RUN git clone --recursive https://github.com/apache/incubator-tvm tvm && \
    cd tvm && \
    git reset --hard 338940dc5044885412f9a6045cb8dcdf9fb639a4 && \
    git submodule init && \
    git submodule update && \
    mkdir ./build && \
    cd build && \
    cmake -DUSE_CUDA=ON -DUSE_CUDNN=ON -DUSE_CUBLAS=ON -DUSE_THRUST=ON -DUSE_LLVM=ON .. && \
    make -j$(nproc) && \
    cd ../python && \
    python3.8 setup.py install && \
    cd ../.. && rm -rf tvm
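As a sanity check in that image (a minimal sketch, assuming tvm.support.libinfo() is available in this 0.8-era build), you can ask the libtvm.so that the installed Python package actually loads which CMake options it was compiled with:

# Query the build flags of the libtvm.so picked up by `import tvm`.
import tvm
import tvm.support

info = tvm.support.libinfo()
print("tvm imported from:", tvm.__file__)
print("USE_CUDA:", info.get("USE_CUDA"))
print("USE_CUDNN:", info.get("USE_CUDNN"))

If USE_CUDA reports OFF (or the key is absent), the CUDA module loader will not be registered at runtime.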

Now I'm generating a TVM wheel and then installing and using it in another Docker container where CUDA and cuDNN are installed as well. Current code:

# TVM wheel generation from another place
RUN git clone --recursive https://github.com/apache/incubator-tvm tvm && \
    cd tvm && \
    git reset --hard 338940dc5044885412f9a6045cb8dcdf9fb639a4 && \
    git submodule init && \
    git submodule update && \
    mkdir ./build && \
    cp cmake/config.cmake build && \
    cd build && \
    cmake -DUSE_CUDA=ON -DUSE_CUDNN=ON -DUSE_CUBLAS=ON -DUSE_THRUST=ON -DUSE_LLVM=ON .. && \
    make -j$(nproc) && \
    cd ../python && $PYTHON_VERSION setup.py bdist_wheel
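Before shipping the wheel, it may help to confirm what actually got packaged (a minimal sketch; the wheel filename is the one from this issue, and the dist/ path is an assumption about where bdist_wheel writes its output):

# List the shared libraries bundled into the generated wheel.
import zipfile

wheel_path = "dist/tvm-0.8.dev1452+g338940dc5-cp38-cp38-linux_x86_64.whl"  # adjust as needed
with zipfile.ZipFile(wheel_path) as whl:
    shared_libs = [name for name in whl.namelist() if name.endswith(".so")]
print("shared libraries in wheel:", shared_libs)

If no CUDA-enabled libtvm.so is bundled, the installed package may end up loading whatever libtvm it finds in the target container, which could have been built without CUDA.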

Installing the wheel in the other Docker image: RUN python3.8 -m pip install tvm-0.8.dev1452+g338940dc5-cp38-cp38-linux_x86_64.whl

But it throws: TVMError: Binary was created using cuda but a loader of that name is not registered

Full tvm logs:

File "/usr/lib64/python3.8/site-packages/tvm/_ffi/_ctypes/packed_func.py", line 237, in __call__
2022-12-12 15:25:54
raise get_last_ffi_error()
2022-12-12 15:25:54
tvm._ffi.base.TVMError: Traceback (most recent call last):
2022-12-12 15:25:54
6: TVMFuncCall
2022-12-12 15:25:54
5: std::_Function_handler<void (tvm::runtime::TVMArgs, tvm::runtime::TVMRetValue*), void tvm::runtime::TypedPackedFunc<tvm::runtime::Module (std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&)>::AssignTypedLambda<tvm::runtime::Module (*)(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&)>(tvm::runtime::Module (*)(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&), std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >)::{lambda(tvm::runtime::TVMArgs const&, tvm::runtime::TVMRetValue*)#1}>::_M_invoke(std::_Any_data const&, tvm::runtime::TVMArgs&&, tvm::runtime::TVMRetValue*&&)
2022-12-12 15:25:54
4: tvm::runtime::Module::LoadFromFile(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&)
2022-12-12 15:25:54
3: std::_Function_handler<void (tvm::runtime::TVMArgs, tvm::runtime::TVMRetValue*), tvm::runtime::{lambda(tvm::runtime::TVMArgs, tvm::runtime::TVMRetValue*)#1}>::_M_invoke(std::_Any_data const&, tvm::runtime::TVMArgs&&, tvm::runtime::TVMRetValue*&&)
2022-12-12 15:25:54
2: tvm::runtime::CreateModuleFromLibrary(tvm::runtime::ObjectPtr<tvm::runtime::Library>)
2022-12-12 15:25:54
1: tvm::runtime::ProcessModuleBlob(char const*, tvm::runtime::ObjectPtr<tvm::runtime::Library>, tvm::runtime::Module*, tvm::runtime::ModuleNode**)
2022-12-12 15:25:54
0: tvm::runtime::LoadModuleFromBinary(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&, dmlc::Stream*)
2022-12-12 15:25:54
File "/tmp/tvm/src/runtime/library_module.cc", line 116
2022-12-12 15:25:54
TVMError: Binary was created using cuda but a loader of that name is not registered. Available loaders are GraphRuntimeFactory, metadata, GraphExecutorFactory, VMExecutable. Perhaps you need to recompile with this runtime enabled.
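The message means the runtime that loads the module has no deserializer registered for "cuda" device code, even though the module was compiled with CUDA support. A quick check inside the failing container (a minimal sketch, assuming tvm.runtime.enabled() exists in this build):

# Check whether the imported tvm runtime registered CUDA support.
import tvm

print("tvm imported from:", tvm.__file__)
print("cuda runtime enabled:", tvm.runtime.enabled("cuda"))

A result of False is consistent with the error above: the loaded libtvm was built without CUDA, so binaries that embed a cuda blob cannot be loaded.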
guojilei commented 7 months ago

Dear friend, have you fixed this issue? I'm hitting the same error.

JamesSand commented 4 months ago

same issue +1

poltomo commented 2 weeks ago

same issue +1

Binary was created using {clml} but a loader of that name is not registered. Available loaders are opencl, GraphRuntimeFactory, GraphExecutorFactory, static_library, relax.Executable, AotExecutorFactory, metadata_module, VMExecutable, metadata, const_loader. Perhaps you need to recompile with this runtime enabled.

https://github.com/apache/tvm/issues/17281