PaddlePaddle / Paddle

PArallel Distributed Deep LEarning: Machine Learning Framework from Industrial Practice (the core PaddlePaddle ('飞桨') framework: high-performance single-machine and distributed training for deep learning and machine learning, plus cross-platform deployment)
http://www.paddlepaddle.org/
Apache License 2.0

Compare Inference Performance Between CPU and MKLDNN [OCR CRNN_CTC model] #10685

luotao1 closed this issue 6 years ago

luotao1 commented 6 years ago

Model

Original Model: https://github.com/PaddlePaddle/models/blob/develop/fluid/ocr_recognition/crnn_ctc_model.py

The saved model directory looks like this (a sketch of how such a directory is typically exported follows the listing):

4.0K    conv2d_0.b_0
4.0K    conv2d_0.w_0
4.0K    conv2d_1.b_0
12K     conv2d_1.w_0
4.0K    conv2d_2.b_0
20K     conv2d_2.w_0
4.0K    conv2d_3.b_0
40K     conv2d_3.w_0
4.0K    conv2d_4.b_0
76K     conv2d_4.w_0
4.0K    conv2d_5.b_0
148K    conv2d_5.w_0
4.0K    conv2d_6.b_0
292K    conv2d_6.w_0
4.0K    conv2d_7.b_0
580K    conv2d_7.w_0
4.0K    fc_0.b_0
904K    fc_0.w_0
4.0K    fc_1.b_0
904K    fc_1.w_0
44K     fc_2.b_0
8.3M    fc_2.w_0
8.3M    fc_2.w_1
4.0K    gru_0.b_0
472K    gru_0.w_0
4.0K    gru_1.b_0
472K    gru_1.w_0
12K     __model__
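
For reference, a minimal sketch of how a directory with this layout (__model__ plus one file per parameter) is typically exported via fluid.io.save_inference_model. The tiny conv+pool+fc network below is only a stand-in for the real CRNN-CTC graph built by crnn_ctc_model.py; the shapes, feed name "pixel", and DIR_PATH are assumptions:

import paddle.fluid as fluid

# Stand-in network; the real graph comes from crnn_ctc_model.py.
image = fluid.layers.data(name="pixel", shape=[1, 48, 384], dtype="float32")
conv = fluid.layers.conv2d(input=image, num_filters=8, filter_size=3, act="relu")
pool = fluid.layers.pool2d(input=conv, pool_size=2, pool_stride=2)
fc = fluid.layers.fc(input=pool, size=95)  # e.g. number of character classes

place = fluid.CPUPlace()
exe = fluid.Executor(place)
exe.run(fluid.default_startup_program())

# Writes __model__ plus one file per parameter (conv2d_0.w_0, fc_0.w_0, ...)
# into DIR_PATH, giving a directory layout like the listing above.
fluid.io.save_inference_model(
    dirname="DIR_PATH",
    feeded_var_names=["pixel"],
    target_vars=[fc],
    executor=exe)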

Test

A patch for testing the crnn_ctc model on the C++ side: https://github.com/PaddlePaddle/Paddle/compare/develop...luotao1:ocr_test?expand=1

# build test
cd build
make test ARGS="-R test_crnn_ctc -V"

# run test
cd paddle/fluid/inference/tests/book
./test_crnn_ctc --dirname=DIR_PATH --batch_size=1 --repeat=10

Note that this gives the multi-threaded MKLDNN result; for a single-threaded run, pin the process to one core with taskset:

taskset -c 0 ./test_crnn_ctc --dirname=DIR_PATH --batch_size=1 --repeat=10

refer: #10651
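
For a quick Python-side sanity check on the same saved model (this is not the C++ test_crnn_ctc from the patch, just a rough analogue; DIR_PATH, the dummy input shape, and feeding only the first feed target are assumptions), a timing loop like the following can be used:

import time
import numpy as np
import paddle.fluid as fluid

place = fluid.CPUPlace()
exe = fluid.Executor(place)
scope = fluid.core.Scope()

with fluid.scope_guard(scope):
    # Load the exported program and its parameters from the model directory.
    [program, feed_names, fetch_targets] = fluid.io.load_inference_model("DIR_PATH", exe)

    # Dummy single-channel image batch; the real test feeds preprocessed OCR images.
    image = np.random.random((1, 1, 48, 384)).astype("float32")

    repeat = 10
    start = time.time()
    for _ in range(repeat):
        exe.run(program, feed={feed_names[0]: image}, fetch_list=fetch_targets)
    print("average latency: %.4f s" % ((time.time() - start) / repeat))

To get the MKLDNN side of the comparison, the same loop can be run against a build configured with WITH_MKLDNN=ON (how the MKLDNN kernels are switched on for inference depends on the build and the flags used by the C++ test in the patch); pinning to a single core with taskset, as above, gives the single-thread numbers.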

shanyi15 commented 6 years ago

Hello, this issue has not been updated in the past month, so we will close it today for the sake of other users' experience. If you still need to follow up on this question after it is closed, please feel free to reopen it and we will get back to you within 24 hours. We apologize for any inconvenience caused by the closure and thank you for your support of PaddlePaddle!