PaddlePaddle / Paddle2ONNX

ONNX Model Exporter for PaddlePaddle
Apache License 2.0

Error when converting a quantized PaddleOCR model to ONNX #1404

Open xdd130 opened 1 day ago

xdd130 commented 1 day ago

Using the model provided on the official website, as shown below: (screenshot)

Problem description: the model is converted with the following command:

paddle2onnx --model_dir . --model_filename inference.pdmodel --params_filename inference.pdiparams --deploy_backend onnxruntime --save_file ./rec_slim.onnx --opset_version 11 --enable_onnx_checker True

Error screenshot

(screenshot of the error)

Other information

xdd130 commented 1 day ago

If the non-quantized model from the official website is used instead, as shown below: (screenshot)

The model above is then quantized manually using dynamic offline (post-training) quantization:

import paddle
import paddleslim

paddle.enable_static()
model_dir = "./ch_PP-OCRv3_rec_infer"
model_filename = 'inference.pdmodel'
params_filename = 'inference.pdiparams'
model_dir_quant_dynamic = "./output_rec"

paddleslim.quant.quant_post_dynamic(
    model_dir=model_dir,                        # directory of the FP32 input model
    model_filename=model_filename,
    params_filename=params_filename,
    save_model_dir=model_dir_quant_dynamic,     # directory for the quantized output
    save_model_filename=model_filename,
    save_params_filename=params_filename,
    weight_bits=8,                              # store weights as 8-bit integers
)
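As a quick sanity check (my own sketch, not part of the original report): since weight_bits=8 stores the weights as int8, the quantized parameter file should come out at roughly a quarter of the size of the FP32 original. The paths below follow the script above; adjust them if your directories differ.

import os

# Parameter files before and after dynamic quantization (paths assumed from the script above).
orig_params = "./ch_PP-OCRv3_rec_infer/inference.pdiparams"
quant_params = "./output_rec/inference.pdiparams"
print("original params:", os.path.getsize(orig_params), "bytes")
print("quantized params:", os.path.getsize(quant_params), "bytes")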

The model is then converted with paddle2onnx:

paddle2onnx --model_dir . --model_filename inference.pdmodel --params_filename inference.pdiparams --deploy_backend onnxruntime --save_file ./rec_slim.onnx --opset_version 11 --enable_onnx_checker True

The ONNX model is exported successfully, but running inference with onnxruntime raises the following error:

 sess = C.InferenceSession(session_options, self._model_path, True, self._read_config_from_model)
onnxruntime.capi.onnxruntime_pybind11_state.InvalidGraph: [ONNXRuntimeError] : 10 : INVALID_GRAPH : Load model from ./onnxocr/models/ppocrv4/rec/model.onnx failed:This is an invalid model. Type Error: Type 'tensor(int8)' of input parameter (conv2d_37.w_0) of operator (Conv) in node (p2o.Conv.0) is invalid.
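The message itself hints at the cause: the standard ONNX Conv operator only accepts floating-point weights, so a graph in which conv2d_37.w_0 is stored as tensor(int8) and fed directly into Conv is rejected by onnxruntime's graph check (quantized convolutions in ONNX are normally expressed via QLinearConv/ConvInteger or QuantizeLinear/DequantizeLinear pairs instead). The diagnostic sketch below is my own, not from the original report; it assumes the exported file is ./rec_slim.onnx as in the --save_file argument above (adjust the path to wherever the model was copied) and lists the int8 initializers together with the Conv nodes that consume them, to confirm how widespread the problem is.

import onnx
from onnx import TensorProto

model = onnx.load("./rec_slim.onnx")

# Names of all initializers stored with element type INT8.
int8_inits = {
    init.name
    for init in model.graph.initializer
    if init.data_type == TensorProto.INT8
}
print("int8 initializers:", sorted(int8_inits))

# Plain Conv nodes that read one of those int8 tensors are the ones
# onnxruntime rejects with the INVALID_GRAPH error above.
for node in model.graph.node:
    if node.op_type == "Conv" and any(name in int8_inits for name in node.input):
        print("Conv with int8 weight:", node.name, list(node.input))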

How can I resolve this?