PaddlePaddle / Paddle-Lite

PaddlePaddle High Performance Deep Learning Inference Engine for Mobile and Edge
https://www.paddlepaddle.org.cn/lite
Apache License 2.0

Converting ONNX to pdlite reports GRU as unsupported? #10374

Open Asxin opened 1 year ago

Asxin commented 1 year ago

Using x2paddle to convert my ONNX model to a pdlite model, I'm told GRU is not supported... Isn't GRU one of the most basic operators? Does pdlite really not support it?

x2paddle --framework=onnx --model=model.onnx --save_dir=./pd_model

model ir_version: 6, op version: 11
Shape inferencing ...
[WARNING] Incomplete symbolic shape inference
Shape inferenced.
Now, onnx2paddle support convert onnx model opset_verison [7, 8, 9, 10, 11, 12, 13, 14, 15], opset_verison of your onnx model is 11.

========= 1 OPs are not supported yet ===========
========== GRU ============
Traceback (most recent call last):
  File "/home/gyf/.local/bin/x2paddle", line 8, in <module>
    sys.exit(main())
  File "/home/gyf/.local/lib/python3.10/site-packages/x2paddle/convert.py", line 489, in main
    onnx2paddle(
  File "/home/gyf/.local/lib/python3.10/site-packages/x2paddle/convert.py", line 304, in onnx2paddle
    mapper = ONNXOpMapper(model)
  File "/home/gyf/.local/lib/python3.10/site-packages/x2paddle/op_mapper/onnx2paddle/onnx_op_mapper.py", line 38, in __init__
    raise Exception("Model is not supported yet.")
Exception: Model is not supported yet.

The same model converts without issues using MNN's conversion tool, and C++ inference deployment also works fine.
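As a sanity check before running x2paddle, the op types in the ONNX graph can be listed with the onnx Python package and compared by hand against X2Paddle's op_list.md. A minimal sketch is below; the SUPPORTED set is only a small illustrative subset, not the real support list.

# Sketch: list op types in an ONNX model and flag ones absent from an
# (illustrative) supported-op set, to spot problems before converting.
import onnx

SUPPORTED = {"Conv", "Relu", "MatMul", "Add", "Sigmoid", "Tanh", "LSTM"}  # illustrative subset only

model = onnx.load("model.onnx")
ops_in_model = {node.op_type for node in model.graph.node}

unsupported = sorted(ops_in_model - SUPPORTED)
print("ops in model:", sorted(ops_in_model))
print("possibly unsupported:", unsupported)  # e.g. ['GRU'] in this case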

Asxin commented 1 year ago

I checked the op support list: https://github.com/PaddlePaddle/X2Paddle/blob/develop/docs/inference_model_convertor/op_list.md GRU is not in the list, so does that mean it is not supported?

GRU is a fairly standard operator; will support be added?
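One common workaround when a converter lacks the fused GRU op is to re-export the model with the GRU cell written out in primitive ops (MatMul, Add, Sigmoid, Tanh), which converters generally do handle. A minimal sketch, assuming the original model comes from PyTorch; sizes and file names are placeholders.

# Sketch: a hand-written GRU cell exports to ONNX as plain MatMul/Add/
# Sigmoid/Tanh ops instead of the fused GRU op.
import torch
import torch.nn as nn

class UnrolledGRUCell(nn.Module):
    def __init__(self, input_size, hidden_size):
        super().__init__()
        self.x2h = nn.Linear(input_size, 3 * hidden_size)   # gate projections from input
        self.h2h = nn.Linear(hidden_size, 3 * hidden_size)  # gate projections from hidden state

    def forward(self, x, h):
        gx = self.x2h(x).chunk(3, dim=-1)
        gh = self.h2h(h).chunk(3, dim=-1)
        r = torch.sigmoid(gx[0] + gh[0])       # reset gate
        z = torch.sigmoid(gx[1] + gh[1])       # update gate
        n = torch.tanh(gx[2] + r * gh[2])      # candidate state
        return (1 - z) * n + z * h

cell = UnrolledGRUCell(input_size=64, hidden_size=128)   # placeholder sizes
x = torch.randn(1, 64)
h = torch.zeros(1, 128)
torch.onnx.export(cell, (x, h), "gru_cell_unrolled.onnx", opset_version=11)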

engineer1109 commented 1 year ago

It won't be supported. Go look at Paddle Inference instead; Paddle Inference can also be used on mobile. If Paddle Inference doesn't have it either, then it really doesn't exist. Paddle-Lite is effectively dead.
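For reference, a minimal sketch of running a converted Paddle model with the Paddle Inference Python API; the file paths are placeholders and the exact layout of the converted model may differ depending on the x2paddle and Paddle versions used.

# Sketch: load a converted Paddle model with Paddle Inference and run a dummy input.
import numpy as np
from paddle.inference import Config, create_predictor

config = Config("pd_model/inference_model/model.pdmodel",     # placeholder paths
                "pd_model/inference_model/model.pdiparams")
predictor = create_predictor(config)

input_name = predictor.get_input_names()[0]
input_handle = predictor.get_input_handle(input_name)
input_handle.copy_from_cpu(np.random.rand(1, 64).astype("float32"))  # dummy input

predictor.run()

output_name = predictor.get_output_names()[0]
output = predictor.get_output_handle(output_name).copy_to_cpu()
print(output.shape)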

Asxin commented 1 year ago

OK, thanks.