VertexC / dl-infer-perf

deep learning inference perf analysis

onnx->tvm fails on mobilenet/inception/resnet50 #4

Open · VertexC opened this issue 3 years ago

VertexC commented 3 years ago
(tvm-onnx-env) root@ubuntu1804-lts-base /s/dl-infer-perf# python infer_perf/onnx2tvm.py resnet50 --batch 64
Cannot find config for target=cuda -keys=cuda,gpu -max_num_threads=1024 -model=unknown -thread_warp_size=32, workload=('dense_small_batch.cuda', ('TENSOR', (1, 2048), 'float32'), ('TENSOR', (1000, 2048), 'float32'), None, 'float32'). A fallback configuration is used, which may bring great performance regression.
Traceback (most recent call last):
  File "infer_perf/onnx2tvm.py", line 70, in <module>
    backend=args.backend)
  File "infer_perf/onnx2tvm.py", line 42, in onnx2tvm_runner
    module.set_input(input_name, data)
  File "/scratch/tvm/python/tvm/contrib/graph_runtime.py", line 182, in set_input
    v.copyfrom(value)
  File "/scratch/tvm/python/tvm/runtime/ndarray.py", line 147, in copyfrom
    source_array.shape, shape
ValueError: array shape do not match the shape of NDArray (64, 3, 224, 224) vs (1, 3, 224, 224)
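The first failure looks like a plain batch-size mismatch rather than a TVM bug: the Relay module seems to have been built for the ONNX model's static input shape (batch 1), while a (64, 3, 224, 224) tensor is fed at run time. The "Cannot find config for target=cuda ..." line above is only an AutoTVM fallback warning and is unrelated to the crash. Below is a minimal sketch of building for the intended batch by passing it through shape_dict; the model file name and the input name "data" are assumptions, not taken from onnx2tvm.py.

```python
# Sketch only: build the Relay module for the same batch size that will be
# fed at run time. File name and input name are assumed, not from the repo.
import numpy as np
import onnx
import tvm
from tvm import relay
from tvm.contrib import graph_runtime

batch = 64
onnx_model = onnx.load("resnet50.onnx")            # assumed model file
shape_dict = {"data": (batch, 3, 224, 224)}        # assumed input name

mod, params = relay.frontend.from_onnx(onnx_model, shape_dict)
with tvm.transform.PassContext(opt_level=3):
    lib = relay.build(mod, target="cuda", params=params)

ctx = tvm.gpu(0)
module = graph_runtime.GraphModule(lib["default"](ctx))
data = np.random.uniform(size=(batch, 3, 224, 224)).astype("float32")
module.set_input("data", data)                     # shapes now agree
module.run()
```

If the script already passes the batch through shape_dict, the second traceback below shows why the build itself still fails for this particular ONNX export.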
(tvm-onnx-env) root@ubuntu1804-lts-base /s/dl-infer-perf# python infer_perf/onnx2tvm.py resnet50 --batch 64
Traceback (most recent call last):
  File "infer_perf/onnx2tvm.py", line 70, in <module>
    backend=args.backend)
  File "infer_perf/onnx2tvm.py", line 38, in onnx2tvm_runner
    lib = relay.build(mod, target, params=params)
  File "/scratch/tvm/python/tvm/relay/build_module.py", line 269, in build
    graph_json, mod, params = bld_mod.build(mod, target, target_host, params)
  File "/scratch/tvm/python/tvm/relay/build_module.py", line 132, in build
    self._build(mod, target, target_host)
  File "/scratch/tvm/python/tvm/_ffi/_ctypes/packed_func.py", line 237, in __call__
    raise get_last_ffi_error()
tvm._ffi.base.TVMError: Traceback (most recent call last):
  [bt] (8) /scratch/tvm/build/libtvm.so(tvm::relay::backend::RelayBuildModule::Optimize(tvm::IRModule, tvm::Map<tvm::Integer, tvm::Target, void, void> const&, std::unordered_map<std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >, tvm::runtime::NDArray, std::hash<std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > >, std::equal_to<std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > >, std::allocator<std::pair<std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const, tvm::runtime::NDArray> > > const&)+0xeb2) [0x7fa63d6ae172]
  [bt] (7) /scratch/tvm/build/libtvm.so(tvm::transform::Pass::operator()(tvm::IRModule) const+0x69) [0x7fa63cb27d09]
  [bt] (6) /scratch/tvm/build/libtvm.so(tvm::transform::SequentialNode::operator()(tvm::IRModule, tvm::transform::PassContext const&) const+0x30b) [0x7fa63cc51f7b]
  [bt] (5) /scratch/tvm/build/libtvm.so(tvm::transform::SequentialNode::operator()(tvm::IRModule, tvm::transform::PassContext const&) const+0x24e) [0x7fa63cc51ebe]
  [bt] (4) /scratch/tvm/build/libtvm.so(tvm::transform::ModulePassNode::operator()(tvm::IRModule, tvm::transform::PassContext const&) const+0x1b7) [0x7fa63cc52b07]
  [bt] (3) /scratch/tvm/build/libtvm.so(+0x1827e0d) [0x7fa63d66de0d]
  [bt] (2) /scratch/tvm/build/libtvm.so(tvm::relay::TypeInferencer::Infer(tvm::GlobalVar, tvm::relay::Function)+0x67) [0x7fa63d66d0e7]
  [bt] (1) /scratch/tvm/build/libtvm.so(tvm::relay::TypeSolver::Solve()+0x1348) [0x7fa63d50be98]
  [bt] (0) /scratch/tvm/build/libtvm.so(+0x16c1a12) [0x7fa63d507a12]
  [bt] (8) /scratch/tvm/build/libtvm.so(tvm::transform::SequentialNode::operator()(tvm::IRModule, tvm::transform::PassContext const&) const+0x30b) [0x7fa63cc51f7b]
  [bt] (7) /scratch/tvm/build/libtvm.so(tvm::transform::SequentialNode::operator()(tvm::IRModule, tvm::transform::PassContext const&) const+0x24e) [0x7fa63cc51ebe]
  [bt] (6) /scratch/tvm/build/libtvm.so(tvm::transform::ModulePassNode::operator()(tvm::IRModule, tvm::transform::PassContext const&) const+0x1b7) [0x7fa63cc52b07]
  [bt] (5) /scratch/tvm/build/libtvm.so(+0x1827e0d) [0x7fa63d66de0d]
  [bt] (4) /scratch/tvm/build/libtvm.so(tvm::relay::TypeInferencer::Infer(tvm::GlobalVar, tvm::relay::Function)+0x67) [0x7fa63d66d0e7]
  [bt] (3) /scratch/tvm/build/libtvm.so(tvm::relay::TypeSolver::Solve()+0x37a) [0x7fa63d50aeca]
  [bt] (2) /scratch/tvm/build/libtvm.so(tvm::runtime::TypedPackedFunc<bool (tvm::runtime::Array<tvm::Type, void> const&, int, tvm::Attrs const&, tvm::TypeReporter const&)>::AssignTypedLambda<bool (*)(tvm::runtime::Array<tvm::Type, void> const&, int, tvm::Attrs const&, tvm::TypeReporter const&)>(bool (*)(tvm::runtime::Array<tvm::Type, void> const&, int, tvm::Attrs const&, tvm::TypeReporter const&))::{lambda(tvm::runtime::TVMArgs const&, tvm::runtime::TVMRetValue*)#1}::operator()(tvm::runtime::TVMArgs const&, tvm::runtime::TVMRetValue*) const+0x4cc) [0x7fa63d12728c]
  [bt] (1) /scratch/tvm/build/libtvm.so(tvm::relay::ReshapeRel(tvm::runtime::Array<tvm::Type, void> const&, int, tvm::Attrs const&, tvm::TypeReporter const&)+0x614) [0x7fa63d41f504]
  [bt] (0) /scratch/tvm/build/libtvm.so(+0x15a7b22) [0x7fa63d3edb22]
  File "/scratch/tvm/src/relay/analysis/type_solver.cc", line 624
TVMError:
---------------------------------------------------------------
An internal invariant was violated during the execution of TVM.
Please read TVM's error reporting guidelines.
More details can be found here: https://discuss.tvm.ai/t/error-reporting/7793.
---------------------------------------------------------------
  Check failed: false == false: [15:24:46] /scratch/tvm/src/relay/op/tensor/transform.cc:701:
---------------------------------------------------------------
An internal invariant was violated during the execution of TVM.
Please read TVM's error reporting guidelines.
More details can be found here: https://discuss.tvm.ai/t/error-reporting/7793.
---------------------------------------------------------------

  Check failed: oshape_sum == data_shape_sum (2048 vs. 131072) : Input tensor shape and reshaped shape are not compatible
VertexC commented 3 years ago

https://github.com/apache/tvm/issues/7476
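For context on the link above: the second failure comes from ReshapeRel during type inference, and the numbers line up with the batch override (131072 = 64 × 2048). That suggests the exported resnet50 graph contains a Reshape node with a hard-coded (1, 2048) target shape, so overriding the batch in shape_dict alone cannot work; this is the behavior discussed in apache/tvm#7476. One possible workaround, sketched under the assumption that the model can be re-exported from torchvision (the repo's own export path may differ), is to regenerate the ONNX file with the batch size you plan to benchmark, or with a dynamic batch axis, so the baked-in shapes match:

```python
# Hypothetical re-export sketch; model source, file name, and input/output
# names are assumptions, not taken from dl-infer-perf.
import torch
import torchvision

batch = 64
model = torchvision.models.resnet50(pretrained=True).eval()
dummy = torch.randn(batch, 3, 224, 224)

torch.onnx.export(
    model,
    dummy,
    "resnet50_b64.onnx",
    input_names=["data"],
    output_names=["output"],
    opset_version=11,
    # Alternatively, keep the batch symbolic so one export serves any batch:
    # dynamic_axes={"data": {0: "batch"}, "output": {0: "batch"}},
)
```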