airockchip / rknn-toolkit2


Inference results don't match #132

Open lrzss opened 2 months ago

lrzss commented 2 months ago

Same model: Python connected-board inference is fine, but when deployed on-board with the RKNPU C++ runtime, the results are completely off. Why is that?

lrzss commented 2 months ago

RKNPU C++ inference result | Python connected-board inference

yuyun2000 commented 2 months ago

Sounds like a model-accuracy problem. First run rknn-toolkit2's accuracy-analysis interface connected to the board and check the per-layer accuracy.

lrzss commented 2 months ago

Hi, I didn't quantize the model.


yuyun2000 commented 2 months ago

Don't worry, this can happen even without quantization. Check the accuracy with the accuracy_analysis interface first.

lrzss commented 2 months ago

OK, I'll take a look first.


lrzss commented 2 months ago

Hi, for accuracy analysis, is that flow the only option? Can I load the already-converted RKNN model directly and analyze it? (I think I tried that before and it didn't work.)


yuyun2000 commented 2 months ago

The connected-board accuracy analysis does analyze the converted RKNN model; it just has to be rebuilt once more. Your accuracy mismatch could also be a bug in the C++ code, but the accuracy-analysis result itself is definitely reliable.

lrzss commented 2 months ago

Here's the code:

```python
rknn.config(
    mean_values=[[123.675, 116.28, 103.53]],
    std_values=[[58.395, 57.12, 57.375]],
    target_platform='rk3588')

# Load the model with the load_xxx interface
ret = rknn.load_onnx(model='onnx_model/fear_z.onnx')

# Build the RKNN model with the build interface
rknn.build(do_quantization=False, dataset='dataset1.txt')

# Run quantization accuracy analysis with the accuracy_analysis interface
rknn.accuracy_analysis(
    inputs=["image128.jpg"],  # images to run inference on
    output_dir="snapshot",    # output directory for the analysis
    target=None,              # target hardware platform, e.g. "rk3588"
    device_id=None,           # device ID
)
```

lrzss commented 2 months ago
layer_name            entire              single
                  cos      euc        cos      euc

[Input] input1 1.00000 | 0.0 1.00000 | 0.0
[Conv] 657 1.00000 | 0.5736 1.00000 | 0.5736
[Relu] 523 1.00000 | 0.5536 1.00000 | 0.4844
[Conv] 660 1.00000 | 0.5772 1.00000 | 0.5392
[Relu] 526 1.00000 | 0.5534 1.00000 | 0.2353
[Conv] 663 1.00000 | 1.1479 1.00000 | 0.7906
[Add] 529 1.00000 | 1.3301 1.00000 | 0.7148
[Conv] 666 1.00000 | 2.2601 1.00000 | 1.0847
[Relu] 532 1.00000 | 1.7260 1.00000 | 0.2852
[Conv] 669 1.00000 | 1.3904 1.00000 | 0.4093
[Relu] 535 1.00000 | 1.0419 1.00000 | 0.1658
[Conv] 672 1.00000 | 1.3937 1.00000 | 0.3832
[Conv] 675 1.00000 | 0.5909 1.00000 | 0.1594
[Relu] 540 1.00000 | 0.5335 1.00000 | 0.1101
[Conv] 678 1.00000 | 1.0364 1.00000 | 0.3335
[Add] 543 1.00000 | 1.7957 1.00000 | 0.2607
[Conv] 681 1.00000 | 1.0310 1.00000 | 0.1746
[Relu] 546 1.00000 | 0.7903 1.00000 | 0.0958
[Conv] 684 0.99999 | 2.1299 1.00000 | 0.3457
[Add] 549 0.99999 | 2.9085 1.00000 | 0.2903
[Conv] 687 0.99999 | 2.6952 1.00000 | 0.3304
[Relu] 552 1.00000 | 1.7371 1.00000 | 0.1382
[Conv] 690 1.00000 | 1.1967 1.00000 | 0.2084
[Relu] 555 1.00000 | 0.8840 1.00000 | 0.0954
[Conv] 693 1.00000 | 1.1661 1.00000 | 0.2047
[Conv] 696 1.00000 | 0.3932 1.00000 | 0.0767
[Relu] 560 1.00000 | 0.3046 1.00000 | 0.0513
[Conv] 699 1.00000 | 0.5776 1.00000 | 0.1600
[Relu] 563 1.00000 | 0.3330 1.00000 | 0.0422
[Conv] 702 1.00000 | 0.4096 1.00000 | 0.0894
[Add] 566 1.00000 | 1.2412 1.00000 | 0.1517
[Conv] 705 1.00000 | 0.5913 1.00000 | 0.0984
[Relu] 569 1.00000 | 0.4283 1.00000 | 0.0530
[Conv] 708 1.00000 | 0.7901 1.00000 | 0.1791
[Relu] 572 1.00000 | 0.4629 1.00000 | 0.0512
[Conv] 711 1.00000 | 0.5908 1.00000 | 0.1100
[Add] 575 1.00000 | 1.4390 1.00000 | 0.1707
[Conv] 714 1.00000 | 0.6110 1.00000 | 0.1006
[Relu] 578 1.00000 | 0.4653 1.00000 | 0.0576
[Conv] 717 1.00000 | 0.9640 1.00000 | 0.2285
[Relu] 581 1.00000 | 0.5573 1.00000 | 0.0538
[Conv] 720 1.00000 | 0.7685 1.00000 | 0.1297
[Add] 584 1.00000 | 1.6833 1.00000 | 0.1952
[Conv] 723 1.00000 | 1.1044 1.00000 | 0.1573
[Relu] 587 1.00000 | 0.7509 1.00000 | 0.0684
[Conv] 726 1.00000 | 0.4214 1.00000 | 0.0823
[Relu] 590 1.00000 | 0.3579 1.00000 | 0.0613
[Conv] 729 1.00000 | 0.6598 1.00000 | 0.1579
[Conv] 732 1.00000 | 0.2598 1.00000 | 0.0462
[Relu] 595 1.00000 | 0.2023 1.00000 | 0.0255
[Conv] 735 1.00000 | 0.2893 1.00000 | 0.0587
[Relu] 598 1.00000 | 0.1654 1.00000 | 0.0153
[Conv] 738 1.00000 | 0.2219 1.00000 | 0.0371
[Add] 601 1.00000 | 0.6968 1.00000 | 0.0911
[Conv] 741 1.00000 | 0.3722 1.00000 | 0.0629
[Relu] 604 1.00000 | 0.2791 1.00000 | 0.0315
[Conv] 744 1.00000 | 0.4052 1.00000 | 0.0778
[Relu] 607 1.00000 | 0.2109 1.00000 | 0.0187
[Conv] 747 1.00000 | 0.2884 1.00000 | 0.0428
[Add] 610 1.00000 | 0.7394 1.00000 | 0.0966
[Conv] 750 1.00000 | 0.3642 1.00000 | 0.0680
[Relu] 613 1.00000 | 0.2284 1.00000 | 0.0259
[Conv] 753 1.00000 | 0.4001 1.00000 | 0.0725
[Relu] 616 1.00000 | 0.2047 1.00000 | 0.0171
[Conv] 756 0.99999 | 0.3022 1.00000 | 0.0422
[Add] 619 1.00000 | 0.7759 1.00000 | 0.0987
[Conv] 759 1.00000 | 0.4546 1.00000 | 0.0743
[Relu] 622 1.00000 | 0.3309 1.00000 | 0.0395
[Conv] 762 1.00000 | 0.6191 1.00000 | 0.1161
[Relu] 625 1.00000 | 0.4718 1.00000 | 0.0582
[Conv] 765 1.00000 | 0.5325 1.00000 | 0.0984
[Conv] 768 1.00000 | 0.4096 1.00000 | 0.0783
[Relu] 630 1.00000 | 0.2711 1.00000 | 0.0348
[Conv] 771 1.00000 | 0.4835 1.00000 | 0.1028
[Relu] 633 1.00000 | 0.2757 1.00000 | 0.0251
[Conv] 774 1.00000 | 0.3433 1.00000 | 0.0534
[Add] 636 1.00000 | 0.6301 1.00000 | 0.0919
[Conv] 777 1.00000 | 0.4245 1.00000 | 0.0767
[Relu] 639 1.00000 | 0.2581 1.00000 | 0.0297
[Conv] 780 1.00000 | 0.4421 1.00000 | 0.0907
[Relu] 642 1.00000 | 0.2487 1.00000 | 0.0229
[Conv] 783 1.00000 | 0.4102 1.00000 | 0.0634
[Add] 645 1.00000 | 0.7621 1.00000 | 0.1025
[Conv] 786 1.00000 | 0.3325 1.00000 | 0.0635
[Relu] 648 1.00000 | 0.2189 1.00000 | 0.0311
[Conv] 789 1.00000 | 0.3675 1.00000 | 0.0808
[Relu] 651 1.00000 | 0.1899 1.00000 | 0.0213
[Conv] 792 1.00000 | 0.3788 1.00000 | 0.0686
[Add] 654 1.00000 | 0.8469 1.00000 | 0.1136
[Conv] output.1 1.00000 | 0.2878 1.00000 | 0.0448

yuyun2000 commented 2 months ago

This needs to be run connected to the board; what you have is the simulator result. That said, this accuracy shows no loss.

lrzss commented 2 months ago

OK. My environment is broken right now and won't run; I'll set it up again and then come back with questions. (I suspected an RKNPU version issue earlier and wrecked the environment in the process.)

lrzss commented 2 months ago

layer_name        simulator_error                      runtime_error
                  entire          single               entire          single_sim
                  cos     euc     cos     euc          cos     euc     cos     euc

[Input] input2 1.00000 | 0.0 1.00000 | 0.0 0.99995 | 0.1627 0.99995 | 0.1627
[Conv] 779 0.99999 | 1.0755 0.99999 | 1.0755
[Relu] 524 0.99998 | 1.0397 1.00000 | 0.7689 0.99997 | 1.4931 1.00000 | 1.3477
[Conv] 782 0.99997 | 0.8724 0.99996 | 0.7143
[Relu] 527 0.99997 | 0.8659 0.99999 | 0.4307 0.99996 | 1.3155 1.00000 | 0.0046
[Conv] 785 1.00000 | 2.5700 0.99995 | 1.8921
[Add] 530 1.00000 | 2.8986 1.00000 | 1.4192 1.00000 | 3.5750 1.00000 | 1.7143
[Conv] 788 0.99991 | 4.3304 0.99998 | 1.7811
[Relu] 533 1.00000 | 3.5543 0.99997 | 0.4015 1.00000 | 4.2574 1.00000 | 0.0081
[Conv] 791 1.00000 | 3.7970 1.00000 | 0.6894
[Relu] 536 1.00000 | 3.4721 1.00000 | 0.2775 0.99999 | 3.8574 1.00000 | 0.0041
[Conv] 794 1.00000 | 4.6669 1.00000 | 0.6468 0.99999 | 5.1509 1.00000 | 0.0051
[Conv] 797 0.99999 | 1.4841 1.00000 | 0.3100
[Relu] 541 0.99999 | 1.4627 1.00000 | 0.2369 0.99999 | 1.6763 1.00000 | 0.0004
[Conv] 800 0.99998 | 2.6342 1.00000 | 0.6616
[Add] 544 0.99998 | 5.0976 0.99999 | 0.3201 0.99998 | 5.9654 1.00000 | 0.2242
[Conv] 803 1.00000 | 1.7949 1.00000 | 0.2781
[Relu] 547 1.00000 | 1.4582 1.00000 | 0.1654 0.99999 | 1.7563 1.00000 | 0.0005
[Conv] 806 0.99992 | 4.1477 1.00000 | 0.6346
[Add] 550 0.99998 | 6.6513 0.99999 | 0.3351 0.99998 | 7.6858 1.00000 | 0.2762
[Conv] 809 0.99998 | 6.3415 1.00000 | 0.3971
[Relu] 553 0.99998 | 4.4356 1.00000 | 0.1916 1.00000 | 5.1056 1.00000 | 0.0055
[Conv] 812 0.99999 | 4.1514 1.00000 | 0.3703
[Relu] 556 0.99999 | 3.7521 1.00000 | 0.1947 0.99999 | 3.9819 1.00000 | 0.0021
[Conv] 815 0.99998 | 5.0604 1.00000 | 0.3820 0.99999 | 4.7388 1.00000 | 0.0051
[Conv] 818 0.99999 | 1.5985 1.00000 | 0.1333
[Relu] 561 1.00000 | 1.3061 1.00000 | 0.0875 0.99999 | 1.2282 1.00000 | 0.0008
[Conv] 821 0.99998 | 2.8758 1.00000 | 0.2680
[Relu] 564 0.99998 | 1.9675 1.00000 | 0.0714 0.99998 | 1.7026 1.00000 | 0.0039
[Conv] 824 0.99999 | 2.4641 1.00000 | 0.1452
[Add] 567 0.99997 | 5.8444 1.00000 | 0.2532 0.99998 | 5.2529 1.00000 | 0.2041
[Conv] 827 0.99999 | 2.8012 1.00000 | 0.1673
[Relu] 570 0.99999 | 2.3205 1.00000 | 0.1055 0.99999 | 2.0396 1.00000 | 0.0013
[Conv] 830 0.99998 | 4.3316 1.00000 | 0.3453
[Relu] 573 0.99998 | 2.8934 1.00000 | 0.0857 0.99999 | 2.3967 1.00000 | 0.0008
[Conv] 833 0.99994 | 3.7272 1.00000 | 0.1789
[Add] 576 0.99994 | 7.0802 1.00000 | 0.2232 0.99996 | 6.2152 1.00000 | 0.1724
[Conv] 836 0.99998 | 3.0335 1.00000 | 0.1314
[Relu] 579 0.99998 | 2.5853 1.00000 | 0.0914 0.99999 | 2.2117 1.00000 | 0.0011
[Conv] 839 0.99998 | 4.3081 1.00000 | 0.3882
[Relu] 582 0.99999 | 1.9032 1.00000 | 0.0846 0.99999 | 1.9840 1.00000 | 0.0005
[Conv] 842 0.99995 | 2.6459 1.00000 | 0.2060
[Add] 585 0.99993 | 7.6328 1.00000 | 0.1969 0.99994 | 6.9910 1.00000 | 0.1392
[Conv] 845 0.99995 | 4.7125 1.00000 | 0.1623
[Relu] 588 0.99996 | 2.8233 1.00000 | 0.0649 0.99996 | 2.6960 1.00000 | 0.0011
[Conv] 848 0.99999 | 2.3845 1.00000 | 0.1399
[Relu] 591 0.99999 | 2.1314 1.00000 | 0.1242 1.00000 | 1.8691 1.00000 | 0.0004
[Conv] 851 0.99996 | 3.9650 1.00000 | 0.3250 0.99997 | 3.2082 1.00000 | 0.0050
[Conv] 854 0.99998 | 1.4926 1.00000 | 0.0713
[Relu] 596 0.99998 | 1.1043 1.00000 | 0.0375 0.99999 | 0.9031 1.00000 | 0.0005
[Conv] 857 0.99995 | 2.8068 1.00000 | 0.0930
[Relu] 599 0.99992 | 1.5951 1.00000 | 0.0270 0.99996 | 1.0995 1.00000 | 0.0010
[Conv] 860 0.99997 | 1.6445 1.00000 | 0.0628
[Add] 602 0.99995 | 4.4660 1.00000 | 0.1362 0.99997 | 3.5387 1.00000 | 0.0881
[Conv] 863 0.99997 | 2.3713 1.00000 | 0.0962
[Relu] 605 0.99998 | 1.7126 1.00000 | 0.0509 0.99998 | 1.4499 1.00000 | 0.0015
[Conv] 866 0.99998 | 3.1378 1.00000 | 0.1404
[Relu] 608 0.99995 | 1.5366 1.00000 | 0.0330 0.99997 | 1.1849 1.00000 | 0.0028
[Conv] 869 0.99996 | 1.8629 1.00000 | 0.0813
[Add] 611 0.99994 | 4.9314 1.00000 | 0.1431 0.99996 | 3.8591 1.00000 | 0.1166
[Conv] 872 0.99996 | 2.8632 1.00000 | 0.0949
[Relu] 614 0.99997 | 1.6831 1.00000 | 0.0441 0.99998 | 1.3731 1.00000 | 0.0020
[Conv] 875 0.99997 | 3.0728 1.00000 | 0.1396
[Relu] 617 0.99995 | 1.6367 1.00000 | 0.0338 0.99996 | 1.3323 1.00000 | 0.0009
[Conv] 878 0.99996 | 2.2494 1.00000 | 0.0905
[Add] 620 0.99994 | 5.4847 1.00000 | 0.1584 0.99997 | 4.2795 1.00000 | 0.1280
[Conv] 881 0.99995 | 3.7616 1.00000 | 0.1159
[Relu] 623 0.99996 | 3.3158 1.00000 | 0.0755 0.99998 | 2.2758 1.00000 | 0.0023
[Conv] 884 0.99993 | 7.6051 1.00000 | 0.2326
[Relu] 626 0.99995 | 5.2190 1.00000 | 0.1109 0.99998 | 3.5104 1.00000 | 0.0012
[Conv] 887 0.99990 | 5.7242 1.00000 | 0.1727 0.99995 | 4.0718 1.00000 | 0.0041
[Conv] 890 0.99994 | 4.4906 1.00000 | 0.1168
[Relu] 631 0.99994 | 2.8980 1.00000 | 0.0548 0.99997 | 2.1243 1.00000 | 0.0023
[Conv] 893 0.99995 | 6.0398 1.00000 | 0.1800
[Relu] 634 0.99988 | 2.8003 1.00000 | 0.0385 0.99993 | 2.0606 1.00000 | 0.0011
[Conv] 896 0.99986 | 3.5745 1.00000 | 0.0830
[Add] 637 0.99989 | 6.7982 1.00000 | 0.1349 0.99994 | 4.8667 1.00000 | 0.1095
[Conv] 899 0.99993 | 4.5291 1.00000 | 0.1170
[Relu] 640 0.99993 | 3.0169 1.00000 | 0.0524 0.99996 | 2.1687 1.00000 | 0.0016
[Conv] 902 0.99994 | 6.5610 0.99999 | 0.1757
[Relu] 643 0.99988 | 3.5285 1.00000 | 0.0489 0.99994 | 2.5415 1.00000 | 0.0007
[Conv] 905 0.99984 | 5.6482 1.00000 | 0.1365
[Add] 646 0.99987 | 8.8883 1.00000 | 0.1677 0.99993 | 6.5569 1.00000 | 0.1436
[Conv] 908 0.99995 | 3.6204 1.00000 | 0.1063
[Relu] 649 0.99994 | 2.6540 1.00000 | 0.0513 0.99996 | 2.0353 1.00000 | 0.0014
[Conv] 911 0.99995 | 6.0365 1.00000 | 0.1695
[Relu] 652 0.99992 | 2.3043 1.00000 | 0.0394 0.99995 | 1.7914 1.00000 | 0.0004
[Conv] 914 0.99982 | 4.6603 1.00000 | 0.1294
[Add] 655 0.99987 | 10.128 1.00000 | 0.1843 0.99992 | 7.6849 1.00000 | 0.1455
[Conv] 917 0.99985 | 3.5293 1.00000 | 0.0709 0.99991 | 2.6603 1.00000 | 0.0023
[Conv] 670 0.99984 | 2.6902 1.00000 | 0.0462 0.99991 | 2.0124 1.00000 | 1.5258
[Conv] 920 0.99990 | 2.8363 1.00000 | 0.0570
[Relu] 673 0.99988 | 1.7877 1.00000 | 0.0241 0.99992 | 1.4911 1.00000 | 0.0003
[Reshape] 708_rs 0.99988 | 1.7877 1.00000 | 0.0241 0.99992 | 1.4911 1.00000 | 0.0
[Conv] 686 0.99985 | 2.0782 1.00000 | 0.0418 0.99991 | 1.5852 1.00000 | 0.0006
[Conv] 923 0.99992 | 2.3576 1.00000 | 0.0536
[Relu] 689 0.99992 | 1.6843 1.00000 | 0.0274 0.99994 | 1.3818 1.00000 | 0.0009
[Reshape] 740_rs 0.99992 | 1.6843 1.00000 | 0.0274 0.99994 | 1.3818 1.00000 | 0.0
[Input] input1 1.00000 | 0.0 1.00000 | 0.0 1.00000 | 0.0279 1.00000 | 0.0279
[Reshape] 669_rs 1.00000 | 0.0279 1.00000 | 0.0279 1.00000 | 0.0279 1.00000 | 0.0
[exMatMul] 709_mm 0.99991 | 17.185 1.00000 | 0.3937 0.99994 | 14.681 1.00000 | 0.0104
[Reshape] 716 0.99991 | 17.185 1.00000 | 0.2689 0.99994 | 14.681 1.00000 | 0.0
[Concat] 717 0.99991 | 17.277 1.00000 | 0.2700 0.99994 | 14.756 1.00000 | 0.0
[Conv] 718 0.99985 | 5.8374 1.00000 | 0.2120 0.99988 | 5.3110 1.00000 | 0.0001
[Conv] 926 0.99992 | 1.8950 1.00000 | 0.0457
[Relu] 721 0.99992 | 1.1237 1.00000 | 0.0188 0.99995 | 0.9366 1.00000 | 0.0006
[Conv] 767 0.99996 | 0.6037 1.00000 | 0.0211 0.99997 | 0.5079 1.00000 | 6.1035
[Conv] 938 0.99995 | 1.4171 1.00000 | 0.0534
[Relu] 770 0.99995 | 0.9582 1.00000 | 0.0207 0.99997 | 0.8171 1.00000 | 0.0012
[Conv] 771 0.99998 | 0.5420 1.00000 | 0.0236 0.99998 | 0.4684 1.00000 | 0.0001
[Conv] 941 0.99993 | 2.3373 1.00000 | 0.1043
[Relu] 774 0.99994 | 1.4226 1.00000 | 0.0268 0.99996 | 1.2197 1.00000 | 0.0011
[Conv] 775 0.99997 | 2.5461 1.00000 | 0.0870 0.99998 | 2.2666 1.00000 | 0.0
[Conv] output.2 1.00000 | 0.4268 1.00000 | 0.0245 1.00000 | 0.4016 1.00000 | 0.0
[exMatMul] 741_mm 0.99998 | 15.574 1.00000 | 0.6127 0.99998 | 15.312 1.00000 | 0.0019
[Reshape] 748 0.99998 | 15.574 1.00000 | 0.5193 0.99998 | 15.312 1.00000 | 0.0
[Concat] 749 0.99998 | 15.665 1.00000 | 0.5200 0.99998 | 15.374 1.00000 | 0.0
[Conv] 750 0.99997 | 5.5750 1.00000 | 0.3345 0.99997 | 5.4625 1.00000 | 0.0078
[Conv] 929 0.99997 | 1.4099 1.00000 | 0.0536
[Relu] 753 0.99997 | 0.9283 1.00000 | 0.0270 0.99998 | 0.8844 1.00000 | 0.0007
[Conv] 754 0.99999 | 0.5446 1.00000 | 0.0301 0.99999 | 0.5356 1.00000 | 0.0002
[Conv] 932 0.99998 | 1.1095 1.00000 | 0.0653
[Relu] 757 0.99998 | 0.8302 1.00000 | 0.0316 0.99999 | 0.8067 1.00000 | 0.0012
[Conv] 758 0.99999 | 0.5538 1.00000 | 0.0349 0.99999 | 0.5427 1.00000 | 3.0517
[Conv] 935 0.99997 | 1.7551 1.00000 | 0.0959
[Relu] 761 0.99998 | 1.2928 1.00000 | 0.0399 0.99998 | 1.2501 1.00000 | 0.0020
[Conv] 762 0.99996 | 2.7588 1.00000 | 0.0867 0.99997 | 2.5250 1.00000 | 0.0
[Conv] 765 0.99997 | 0.6171 1.00000 | 0.0187 0.99997 | 0.6163 1.00000 | 0.0001
[Exp] output.1 0.99997 | 4.6510 1.00000 | 0.3949 0.99997 | 4.5796 1.00000 | 0.0

lrzss commented 2 months ago

The connected-board inference result also shows essentially no loss. Running the program via connected-board inference works fine and the tracking results are all good; once I move to the C++ deployment, the results break. The model is FP16. Below is the model input configuration, but the output is just wrong.

Model 1:

```cpp
rknn_input rknn_img[1];
memset(rknn_img, 0, sizeof(rknn_img));
rknn_img[0].index = 0;
rknn_img[0].type = RKNN_TENSOR_UINT8;  // FLOAT32
rknn_img[0].size = z_crop.cols * z_crop.rows * z_crop.channels();
rknn_img[0].fmt = RKNN_TENSOR_NHWC;
rknn_img[0].buf = z_crop.data;
// rknn_img[0].pass_through = 0;
rknn_inputs_set(net_z, 1, rknn_img);
```

Model 2:

```cpp
x_img[0].index = 0;
x_img[0].type = RKNN_TENSOR_FLOAT16;
x_img[0].size = zf[0].size;
x_img[0].fmt = RKNN_TENSOR_NHWC;
x_img[0].buf = zf[0].buf;

x_img[1].index = 1;
x_img[1].type = RKNN_TENSOR_UINT8;
x_img[1].size = x_crop.cols * x_crop.rows * x_crop.channels();
x_img[1].fmt = RKNN_TENSOR_NHWC;
x_img[1].buf = x_crop.data;
rknn_inputs_set(net_x, 2, x_img);
```
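One thing worth double-checking on the C++ side: the mean/std values passed to `rknn.config` are compiled into the RKNN model, so the runtime expects raw UINT8 NHWC pixels. A minimal NumPy sketch (using the config values from this thread) of what the on-device normalization computes, useful for verifying that the host code is not normalizing a second time:

```python
import numpy as np

# Sketch only: reproduce the normalization that rknn.config
# (mean_values / std_values) bakes into the compiled model.
MEAN = np.array([123.675, 116.28, 103.53], dtype=np.float32)
STD = np.array([58.395, 57.12, 57.375], dtype=np.float32)

def npu_normalize(img_u8):
    """img_u8: HWC uint8 image, as the NPU receives it via rknn_inputs_set."""
    return (img_u8.astype(np.float32) - MEAN) / STD

# If the C++ code also subtracts mean / divides by std before
# rknn_inputs_set, the model effectively normalizes twice.
img = np.full((2, 2, 3), 128, dtype=np.uint8)
out = npu_normalize(img)
```

If the C++ output only matches after removing a host-side normalization step, this was the mismatch.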
lrzss commented 2 months ago

There shouldn't be any real difference between rknn-toolkit2 and rknpu2 inference, right? In principle the two should produce identical or nearly identical results. What could cause such a large discrepancy?
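One way to quantify the gap is the same cosine metric the accuracy-analysis tables report: dump the output tensor from both the Python connected-board run and the C++ run, then compare. A small self-contained sketch (the dump-and-load step, e.g. via `.npy` files, is left out):

```python
import numpy as np

def cosine_sim(a, b):
    """Cosine similarity between two flattened tensors,
    the same 'cos' score shown in accuracy_analysis output."""
    a = np.asarray(a, dtype=np.float32).ravel()
    b = np.asarray(b, dtype=np.float32).ravel()
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Example with placeholder data; in practice load the two dumped outputs.
py_out = np.array([0.1, 0.2, 0.3])
cpp_out = np.array([0.1, 0.2, 0.3])
sim = cosine_sim(py_out, cpp_out)  # close to 1.0 when the runtimes agree
```

A score well below ~0.99 on the final output points at the host-side code (preprocessing or output handling) rather than the model.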

yuyun2000 commented 2 months ago

That means the problem is in your C++ code. As long as the analysis interface results are correct, the RKNN model itself is fine.

lrzss commented 2 months ago

OK, it's probably on my end, likely in the preprocessing!! Thanks a lot.
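A common preprocessing culprit worth ruling out: OpenCV's `imread` returns BGR, while the mean values in the `rknn.config` call above (`[123.675, 116.28, 103.53]`) are the ImageNet means in RGB order. A tiny sketch of the channel-order check (hypothetical example data):

```python
import numpy as np

def bgr_to_rgb(img):
    """Reverse the channel axis: BGR (OpenCV default) -> RGB."""
    return img[..., ::-1]

# One pixel of hypothetical BGR data, as cv2.imread would deliver it.
bgr = np.zeros((1, 1, 3), dtype=np.uint8)
bgr[0, 0] = (10, 20, 30)  # B, G, R
rgb = bgr_to_rgb(bgr)
```

If the Python-side script converted to RGB before inference but the C++ path feeds OpenCV's BGR buffer straight into `rknn_inputs_set`, the two runtimes will disagree even though the model is identical.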