zhaiyi000 opened this issue 1 year ago (status: Open)
Hi there, executing the following code causes a crash:
```python
from tvm.meta_schedule.testing.relay_workload import get_network

get_network('bert_base', [1, 128])
```
```
(py38) root@d243fdd3cf8c:~/tlm/build# pip list | grep torch
torch          1.13.1
torchaudio     0.13.1
torchvision    0.14.1
(py38) root@d243fdd3cf8c:~/tlm/build# cat /etc/issue
Ubuntu 20.04.6 LTS \n \l
```
```
Traceback (most recent call last):
  File "test.py", line 2, in <module>
    get_network('bert_base', [1, 128])
  File "/root/tlm/python/tvm/meta_schedule/testing/relay_workload.py", line 247, in get_network
    mod, params_bytearray, inputs = _get_network((name, input_shape, layout))  # use mp kind of slow, not use may lead oom
  File "/root/tlm/python/tvm/meta_schedule/testing/relay_workload.py", line 164, in _get_network
    mod, params = relay.frontend.from_pytorch(scripted_model, shape_list)
  File "/root/tlm/python/tvm/relay/frontend/pytorch.py", line 5002, in from_pytorch
    outputs = converter.convert_operators(operator_nodes, outputs, ret_name)
  File "/root/tlm/python/tvm/relay/frontend/pytorch.py", line 4256, in convert_operators
    relay_out = relay_op(
  File "/root/tlm/python/tvm/relay/frontend/pytorch.py", line 1732, in linear
    mm_out = self.matmul(
  File "/root/tlm/python/tvm/relay/frontend/pytorch.py", line 1961, in matmul
    a = _op.broadcast_to(a, batch_shape + list(a_shape[-2:]))
  File "/root/tlm/python/tvm/relay/op/transform.py", line 865, in broadcast_to
    return _make.broadcast_to(data, shape)
  File "/root/tlm/python/tvm/_ffi/_ctypes/packed_func.py", line 237, in __call__
    raise get_last_ffi_error()
tvm._ffi.base.TVMError: Traceback (most recent call last):
  2: TVMFuncCall
  1: tvm::runtime::PackedFuncObj::Extractor<tvm::runtime::PackedFuncSubObj<tvm::runtime::TypedPackedFunc<tvm::RelayExpr (tvm::RelayExpr, tvm::runtime::Array<tvm::Integer, void>)>::AssignTypedLambda<tvm::RelayExpr (*)(tvm::RelayExpr, tvm::runtime::Array<tvm::Integer, void>)>(tvm::RelayExpr (*)(tvm::RelayExpr, tvm::runtime::Array<tvm::Integer, void>), std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >)::{lambda(tvm::runtime::TVMArgs const&, tvm::runtime::TVMRetValue*)#1}> >::Call(tvm::runtime::PackedFuncObj const*, tvm::runtime::TVMArgs, tvm::runtime::TVMRetValue*)
  0: tvm::runtime::TVMMovableArgValueWithContext_::operator tvm::runtime::Array<tvm::Integer, void><tvm::runtime::Array<tvm::Integer, void> >() const
  3: TVMFuncCall
  2: tvm::runtime::PackedFuncObj::Extractor<tvm::runtime::PackedFuncSubObj<tvm::runtime::TypedPackedFunc<tvm::RelayExpr (tvm::RelayExpr, tvm::runtime::Array<tvm::Integer, void>)>::AssignTypedLambda<tvm::RelayExpr (*)(tvm::RelayExpr, tvm::runtime::Array<tvm::Integer, void>)>(tvm::RelayExpr (*)(tvm::RelayExpr, tvm::runtime::Array<tvm::Integer, void>), std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >)::{lambda(tvm::runtime::TVMArgs const&, tvm::runtime::TVMRetValue*)#1}> >::Call(tvm::runtime::PackedFuncObj const*, tvm::runtime::TVMArgs, tvm::runtime::TVMRetValue*)
  1: tvm::runtime::TVMMovableArgValueWithContext_::operator tvm::runtime::Array<tvm::Integer, void><tvm::runtime::Array<tvm::Integer, void> >() const
  0: tvm::runtime::Array<tvm::Integer, void> tvm::runtime::TVMPODValue_::AsObjectRef<tvm::runtime::Array<tvm::Integer, void> >() const
  File "/root/tlm/include/tvm/runtime/packed_func.h", line 777
TVMError: In function relay.op._make.broadcast_to(0: RelayExpr, 1: Array<IntImm>) -> RelayExpr: error while converting argument 1: [02:00:59] /root/tlm/include/tvm/runtime/packed_func.h:1866:
---------------------------------------------------------------
An error occurred during the execution of TVM.
For more information, please see: https://tvm.apache.org/docs/errors.html
---------------------------------------------------------------
  Check failed: (!checked_type.defined()) is false: Expected Array[IntImm], but got Array[index 0: tir.Any]
```
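For context, the traceback shows the failure is inside `relay.frontend.from_pytorch` rather than in meta_schedule itself: the frontend's `matmul` converter (pytorch.py, line 1961) passes `batch_shape` to `_op.broadcast_to`, and one of those dimensions is a dynamic `tir.Any` instead of the constant `IntImm` that `broadcast_to` expects. Below is a minimal sketch that approximates what `_get_network("bert_base", ...)` does under the hood, so the PyTorch import step can be reproduced in isolation. It assumes the HuggingFace `transformers` package is installed; the exact `BertConfig` options used in `relay_workload.py` may differ.

```python
# Hedged sketch: approximate reproduction of the bert_base import path.
# The BertConfig arguments here are illustrative, not the exact ones used
# by tvm.meta_schedule.testing.relay_workload.
import torch
import transformers
from tvm import relay

config = transformers.BertConfig(num_hidden_layers=12)
model = transformers.BertModel(config).eval()

input_shape = (1, 128)
input_data = torch.randint(10000, input_shape)  # dummy token ids

# Trace the model to TorchScript, then import it into Relay.
# The crash above is raised during this from_pytorch call while
# converting an aten::linear/matmul node.
with torch.no_grad():
    scripted_model = torch.jit.trace(model, [input_data], strict=False)

shape_list = [("input_ids", input_shape)]
mod, params = relay.frontend.from_pytorch(scripted_model, shape_list)
```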
Hi! I ran into the same error. Have you solved the problem?