LiuPeiqiCN opened 1 year ago
I met the same error message. I had passed --use_cuda and --enable_training, which triggered https://github.com/microsoft/onnxruntime/blob/main/tools/ci_build/build.py#L1963 and https://github.com/microsoft/onnxruntime/blob/main/setup.py#L523, so eventually package_name was not "onnxruntime-tvm": https://github.com/microsoft/onnxruntime/blob/main/setup.py#L597.
I solved it by removing the --use_cuda and --enable_training arguments. I'm not sure what caused your error, but basically you need to make sure package_name is "onnxruntime-tvm", which means execution reaches https://github.com/microsoft/onnxruntime/blob/main/setup.py#L598.
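To make the masking effect concrete, here is a hedged sketch of how extra build flags can override the wheel name. The function name `derive_package_name` and the exact precedence order are my assumptions for illustration; the real logic lives in onnxruntime's setup.py at the links above.

```python
# Hypothetical sketch (NOT the actual setup.py code): build flags that
# rename the wheel are checked before the TVM branch, so combining
# --use_tvm with --use_cuda or --enable_training never yields
# the "onnxruntime-tvm" package name.
def derive_package_name(use_tvm: bool, use_cuda: bool, enable_training: bool) -> str:
    if enable_training:           # assumption: training rename wins first
        return "onnxruntime-training"
    if use_cuda:                  # assumption: CUDA rename wins next
        return "onnxruntime-gpu"
    if use_tvm:                   # only reached when the flags above are off
        return "onnxruntime-tvm"
    return "onnxruntime"

# A TVM-only build keeps the expected name; adding --use_cuda masks it.
print(derive_package_name(use_tvm=True, use_cuda=False, enable_training=False))
print(derive_package_name(use_tvm=True, use_cuda=True, enable_training=False))
```

Under these assumptions, the fix in the comment above (dropping --use_cuda and --enable_training) is exactly what lets the TVM branch run.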
Hello, does the Python interface of onnxruntime-tvm support CUDA? When I use target == "cuda", an error is reported.
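For reference, a minimal sketch of how the Python interface is typically invoked for this question. The provider name "TvmExecutionProvider" and the "target" option key are assumptions based on the TVM EP documentation, not confirmed by this thread; the import is guarded so the sketch stands alone even without the onnxruntime-tvm wheel installed.

```python
# Hedged usage sketch: assumes the TVM EP registers as
# "TvmExecutionProvider" and accepts a "target" provider option.
providers = ["TvmExecutionProvider"]
provider_options = [{"target": "cuda"}]  # assumption: TVM target string

try:
    import onnxruntime as ort
    # "model.onnx" is a placeholder path for illustration.
    sess = ort.InferenceSession(
        "model.onnx",
        providers=providers,
        provider_options=provider_options,
    )
except ImportError:
    # onnxruntime-tvm wheel not installed in this environment
    sess = None
```

If session creation raises during initialization (as reported below), checking that the installed wheel really is onnxruntime-tvm is the first step suggested in this thread.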
Describe the issue
Code:
Exception:
Build:
How do I use TVM with C++? Is there any example or documentation?
To reproduce
Exception during initialization when using the TVM execution provider.
Urgency
No response
Platform
Windows
OS Version
win11
ONNX Runtime Installation
Built from Source
ONNX Runtime Version or Commit ID
1.13.1
ONNX Runtime API
C++
Architecture
X64
Execution Provider
TVM
Execution Provider Library Version
No response