apache / tvm

Open deep learning compiler stack for cpu, gpu and specialized accelerators
https://tvm.apache.org/
Apache License 2.0

[Bug] [paddlepaddle] Paddle Model transformation problem #13520

Closed: lijiannan1241 closed this issue 1 year ago

lijiannan1241 commented 1 year ago

[Paddle Model transformation problem]

Model name: ViT_base_patch16_224 or DeiT_base_patch16_224

The Paddle model conversion fails with the following transform error:

[Adapter][DEBUG] Unhandled exception:
Traceback (most recent call last):
  File "/home/ncsdk/bin/paddle_phyrelay.py", line 155, in
    args.no_transforms, args.save_temps, args.loglevel == 'DEBUG')
  File "/home/ncsdk/bin/paddle_phyrelay.py", line 73, in convert_network
    mod, params = relay.frontend.from_paddle(model, shape_dict=input_shape_dict)
  File "/home/ncsdk/common/imgrelay/python/tvm/relay/frontend/paddlepaddle.py", line 2337, in from_paddle
    mod, params = g.from_translated_layer(program_or_layer, shape_dict)
  File "/home/ncsdk/common/imgrelay/python/tvm/relay/frontend/paddlepaddle.py", line 2285, in from_translated_layer
    self.ops_to_relay(program, input_specs)
  File "/home/ncsdk/common/imgrelay/python/tvm/relay/frontend/paddlepaddle.py", line 2245, in ops_to_relay
    convert_func(self, op, block)
  File "/home/ncsdk/common/imgrelay/python/tvm/relay/frontend/paddlepaddle.py", line 469, in convert_elementwise_op
    ipt0_shape = infer_shape(ipt0)
  File "/home/ncsdk/common/imgrelay/python/tvm/relay/frontend/common.py", line 526, in infer_shape
    out_type = infer_type(inputs, mod=mod)
  File "/home/ncsdk/common/imgrelay/python/tvm/relay/frontend/common.py", line 501, in infer_type
    new_mod = _transform.InferType()(new_mod)
  File "/home/ncsdk/common/imgrelay/python/tvm/ir/transform.py", line 161, in __call__
    return _ffi_transform_api.RunPass(self, mod)
  File "/home/ncsdk/common/imgrelay/python/tvm/_ffi/_ctypes/packed_func.py", line 237, in __call__
    raise get_last_ffi_error()
tvm._ffi.base.TVMError: Traceback (most recent call last):
  [bt] (8) /home/ncsdk/lib/libtvm.so(+0xfd5185) [0x149b2e910185]
  [bt] (7) /home/ncsdk/lib/libtvm.so(tvm::transform::Pass::operator()(tvm::IRModule) const+0x56) [0x149b2e90bfb6]
  [bt] (6) /home/ncsdk/lib/libtvm.so(tvm::transform::Pass::operator()(tvm::IRModule, tvm::transform::PassContext const&) const+0x5c5) [0x149b2e90b5e5]
  [bt] (5) /home/ncsdk/lib/libtvm.so(tvm::transform::ModulePassNode::operator()(tvm::IRModule, tvm::transform::PassContext const&) const+0x2ce) [0x149b2e90eb8e]
  [bt] (4) /home/ncsdk/lib/libtvm.so(+0x2356c50) [0x149b2fc91c50]
  [bt] (3) /home/ncsdk/lib/libtvm.so(tvm::relay::TypeInferencer::Infer(tvm::GlobalVar, tvm::relay::Function)+0x68) [0x149b2fc90ca8]
  [bt] (2) /home/ncsdk/lib/libtvm.so(tvm::relay::TypeSolver::Solve()+0xb8b) [0x149b2f99a0cb]
  [bt] (1) /home/ncsdk/lib/libtvm.so(+0xc8d380) [0x149b2e5c8380]
  [bt] (0) /home/ncsdk/lib/libtvm.so(tvm::runtime::Backtrace[abi:cxx11]()+0x1d) [0x149b300399dd]
  File "/home/ncsdk/common/imgrelay/src/relay/analysis/type_solver.cc", line 624
TVMError:
An error occurred during the execution of TVM. For more information, please see: https://tvm.apache.org/docs/errors.html
  Check failed: (false) is false: relay.concatenate requires all tensors have the same shape on non-concatenating axes

[Adapter][CRITICAL] TVMError: Traceback (most recent call last): [TRUNCATED]
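For context, the failing check comes from Relay's type inference for concatenate: all inputs must have identical extents on every axis except the concatenation axis. A minimal sketch (shapes chosen purely for illustration, not taken from the ViT/DeiT model) that triggers the same InferType failure:

```python
import tvm
from tvm import relay

# Two inputs that disagree on a non-concatenating axis (axis 2: 224 vs 112).
x = relay.var("x", shape=(1, 3, 224, 224), dtype="float32")
y = relay.var("y", shape=(1, 3, 112, 224), dtype="float32")

# Concatenating along axis 1 requires all other axes to match, so type
# inference rejects this with "relay.concatenate requires all tensors have
# the same shape on non-concatenating axes".
out = relay.concatenate([x, y], axis=1)
mod = tvm.IRModule.from_expr(relay.Function([x, y], out))
mod = relay.transform.InferType()(mod)  # raises TVMError here
```

relay.transform.InferType() is the same pass that fails in the traceback above, so the Paddle frontend is presumably building a concatenate whose converted input shapes disagree.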

lijiannan1241 commented 1 year ago

This is with TVM v0.10.0.

masahi commented 1 year ago

Please post a reproducible test script.
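For anyone assembling one, here is a hedged sketch of what such a script might look like, assuming the model was exported with paddle.jit.save and is loaded as a TranslatedLayer (which matches the from_translated_layer path in the traceback). The model path and the input name "x" are placeholders, not values from this report:

```python
import paddle
import tvm
from tvm import relay

# Placeholder path to the exported inference model (output of paddle.jit.save).
model = paddle.jit.load("./ViT_base_patch16_224/inference")

# ViT-B/16 conventionally takes 1x3x224x224 input; the input name "x" is a
# guess and must match the exported model's actual input spec.
shape_dict = {"x": [1, 3, 224, 224]}

# This is the call that fails in the report with the concatenate shape error.
mod, params = relay.frontend.from_paddle(model, shape_dict=shape_dict)
print(mod)
```

If the export and shape_dict match the reporter's setup, running this on TVM v0.10.0 should reproduce the concatenate shape error.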