Open musicgary opened 5 months ago
It looks like the automatic einsum optimization pass (`convert_einsum_to_exmatmul`) is failing. Before version 2.0 the toolkit did not even support the Einstein-summation operator, so my guess is that support for this operator is still immature. I suggest rewriting the einsum yourself as an equivalent sequence of matrix operations, which is easy to do.
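The suggested rewrite is straightforward for the equation shown later in the log: `bhid,bhjd->bhij` is just a batched matrix product of the first input with the second input transposed over its last two axes. A minimal NumPy sketch (small hypothetical shapes standing in for the real `[1, 1, 1980, 256]` tensors):

```python
import numpy as np

# Hypothetical small shapes standing in for [1, 1, 1980, 256]
q = np.random.rand(1, 1, 5, 8).astype(np.float32)
k = np.random.rand(1, 1, 5, 8).astype(np.float32)

# The problematic operator: Einsum with equation bhid,bhjd->bhij
attn_einsum = np.einsum('bhid,bhjd->bhij', q, k)

# Equivalent expression using only transpose + matmul,
# which RKNN handles as plain MatMul ops
attn_matmul = q @ k.transpose(0, 1, 3, 2)  # swap the last two axes of k

assert np.allclose(attn_einsum, attn_matmul, atol=1e-5)
```

If you apply the same substitution in the model source (e.g. replace the `einsum` call with `q @ k.transpose(-2, -1)` in PyTorch) and re-export to ONNX, the converter never sees an Einsum node at all.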
I hit the same error when converting the flowformer and flowformer++ ONNX models to RKNN. My error output is below, in case it helps diagnose the problem.
```
D unsqueeze_to_4d_add: remove node = [], add node = ['/memory_encoder/feat_encoder/blocks.1.1_1/Add_output_0_rs', '/memory_encoder/feat_encoder/blocks.1.1/mlp/fc2_1/Add_output_0_rs', '/memory_encoder/feat_encoder/blocks.1.1_1/Add_1_output_0-rs']
W build: Show op fuse match nodes:
Ruler: convert_einsum_to_exmatmul
Subgraph:
Op type: Einsum
Op name: /memory_encoder/Einsum
Input:
/memory_encoder/Reshape_1_output_0 : [1, 1, 1980, 256]
/memory_encoder/Reshape_3_output_0 : [1, 1, 1980, 256]
Output:
/memory_encoder/Einsum_output_0 : [1, 1, 1980, 1980]
Attribute:
equation : bhid, bhjd -> bhij
E build: Traceback (most recent call last):
File "rknn/api/rknn_log.py", line 309, in rknn.api.rknn_log.error_catch_decorator.error_catch_wrapper
File "rknn/api/rknn_base.py", line 1901, in rknn.api.rknn_base.RKNNBase.build
File "rknn/api/graph_optimizer.py", line 2069, in rknn.api.graph_optimizer.GraphOptimizer.fuse_ops
File "rknn/api/rules/matmul.py", line 2130, in rknn.api.rules.matmul._p_convert_einsum_to_exmatmul
File "rknn/api/rules/matmul.py", line 2122, in rknn.api.rules.matmul._p_convert_einsum_to_exmatmul._get_dim_type
IndexError: list index out of range
W build: ===================== WARN(2) =====================
E rknn-toolkit2 version: 2.0.0b23+29ceb58d
Traceback (most recent call last):
File "rknn/api/rknn_log.py", line 309, in rknn.api.rknn_log.error_catch_decorator.error_catch_wrapper
File "rknn/api/rknn_base.py", line 1901, in rknn.api.rknn_base.RKNNBase.build
File "rknn/api/graph_optimizer.py", line 2069, in rknn.api.graph_optimizer.GraphOptimizer.fuse_ops
File "rknn/api/rules/matmul.py", line 2130, in rknn.api.rules.matmul._p_convert_einsum_to_exmatmul
File "rknn/api/rules/matmul.py", line 2122, in rknn.api.rules.matmul._p_convert_einsum_to_exmatmul._get_dim_type
IndexError: list index out of range
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/████/█████/████/████/rknn-toolkit2/rknn-toolkit2/examples/onnx/flowformer/convert.py", line 40, in <module>
ret = rknn.build(do_quantization=DO_QUANTIZATION, dataset='./dataset.txt')
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "████/█████/████/████/████/lib/python3.11/site-packages/rknn/api/rknn.py", line 204, in build
return self.rknn_base.build(do_quantization=do_quantization, dataset=dataset, expand_batch_size=rknn_batch_size)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "rknn/api/rknn_log.py", line 314, in rknn.api.rknn_log.error_catch_decorator.error_catch_wrapper
File "rknn/api/rknn_log.py", line 95, in rknn.api.rknn_log.RKNNLog.e
ValueError: Traceback (most recent call last):
File "rknn/api/rknn_log.py", line 309, in rknn.api.rknn_log.error_catch_decorator.error_catch_wrapper
File "rknn/api/rknn_base.py", line 1901, in rknn.api.rknn_base.RKNNBase.build
File "rknn/api/graph_optimizer.py", line 2069, in rknn.api.graph_optimizer.GraphOptimizer.fuse_ops
File "rknn/api/rules/matmul.py", line 2130, in rknn.api.rules.matmul._p_convert_einsum_to_exmatmul
File "rknn/api/rules/matmul.py", line 2122, in rknn.api.rules.matmul._p_convert_einsum_to_exmatmul._get_dim_type
IndexError: list index out of range
```
Dear all,
I have an .onnx model exported from STGCN++ (opset 17). Using rknn-toolkit2 version 2.0b, when I tried to convert the model to .rknn for deployment on the RK3588, I encountered the following problem:
Going by the error log, I also tried converting with an opset 19 export, still with no luck. The error log is as follows:
Does this mean I have to exclude the Einsum or ReduceMean operators when deploying on the RK3588? Is there any workaround for that? Thanks very much for your help.