majianjia / nnom

A higher-level Neural Network library for microcontrollers.
Apache License 2.0

support dense before sequence rnn. #188

Closed bfs18 closed 1 year ago

bfs18 commented 1 year ago

Without this commit, the model and its I/O format are shown in the following log.

Model version: 0.4.3
NNoM version 0.4.3
To disable logs, please void the marco 'NNOM_LOG(...)' in 'nnom_port.h'.
Data format: Channel last (HWC)
Start compiling model...
Layer(#)         Activation    output shape    ops(MAC)   mem(in, out, buf)      mem blk lifetime
-------------------------------------------------------------------------------------------------
#1   Input      -          - (   1,   1, 147,)          (   147,   147,     0)    1 - - -  - - - - 
#2   Dense      -          - (   8,          )     1176 (   147,     8,   294)    1 1 1 -  - - - - 
#3   RNN/GRU    -          - (   8,   8,     )     6336 (     8,    64,   160)    1 1 1 -  - - - - 
#4   RNN/GRU    -          - (   8,   8,     )     6336 (    64,    64,   160)    1 1 1 -  - - - - 
#5   Dense      -          - (   1,          )       64 (    64,     1,   128)    1 1 1 -  - - - - 
#6   Output     -          - (   1,          )          (     1,     1,     0)    - 1 - -  - - - - 
-------------------------------------------------------------------------------------------------
Memory cost by each block:
 blk_0:160  blk_1:296  blk_2:64  blk_3:0  blk_4:0  blk_5:0  blk_6:0  blk_7:0  
 Memory cost by network buffers: 520 bytes
 Total memory occupied: 2872 bytes
Compling done in 0 ms

Print layer input/output..
Layer(#)        -  Input(Qnm)  Output(Qnm)   Oshape 
----------------------------------------------------------
#1  Input      -    0. 7      0. 7      (   1,   1, 147,)
#2  Dense      -    0. 7      1. 6      (   8,          )
#3  RNN/GRU    -    1. 6      0. 7      (   8,   8,     )
#4  RNN/GRU    -    0. 7      0. 7      (   8,   8,     )
#5  Dense      -    0. 7      1. 6      (   1,          )
#6  Output     -    1. 6      1. 6      (   1,          )
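The MAC counts in this log make the bug visible: the first Dense layer collapses its output to (8,), and the following GRU treats those 8 values as 8 time steps, multiplying its work by 8. A minimal arithmetic check of the logged numbers (layer sizes are read from the log; inputs × outputs is the standard fully-connected MAC count):

```python
# Sanity check of the MAC counts printed in the log above.
# Layer sizes are taken from the log; the Dense MAC formula
# (inputs * outputs) is the standard fully-connected count.

def dense_macs(n_in, n_out):
    return n_in * n_out

# #2 Dense: 147 inputs -> 8 outputs
assert dense_macs(147, 8) == 1176

# #3/#4 RNN/GRU: 6336 MACs is exactly 8x the 792 MACs each GRU
# needs per step after the fix -- the (8,) Dense output was being
# iterated as an 8-step sequence.
assert 6336 == 8 * 792

# #5 Dense: the GRU's flattened 8x8 output (64 values) -> 1 output
assert dense_macs(64, 1) == 64
print("MAC counts consistent with the log")
```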

This commit fixes the shape error in the first Dense layer: its output now keeps the sequence dimension as (1, 8), so the following GRU layers run over a single time step instead of eight.

Model version: 0.4.3
NNoM version 0.4.3
To disable logs, please void the marco 'NNOM_LOG(...)' in 'nnom_port.h'.
Data format: Channel last (HWC)
Start compiling model...
Layer(#)         Activation    output shape    ops(MAC)   mem(in, out, buf)      mem blk lifetime
-------------------------------------------------------------------------------------------------
#1   Input      -          - (   1,   1, 147,)          (   147,   147,     0)    1 - - -  - - - - 
#2   Dense      -          - (   1,   8,     )     1176 (   147,     8,   294)    1 1 1 -  - - - - 
#3   RNN/GRU    -          - (   1,   8,     )      792 (     8,     8,   160)    1 1 1 -  - - - - 
#4   RNN/GRU    -          - (   1,   8,     )      792 (     8,     8,   160)    1 1 1 -  - - - - 
#5   Dense      -          - (   1,          )        8 (     8,     1,    16)    1 1 1 -  - - - - 
#6   Output     -          - (   1,          )          (     1,     1,     0)    - 1 - -  - - - - 
-------------------------------------------------------------------------------------------------
Memory cost by each block:
 blk_0:160  blk_1:296  blk_2:8  blk_3:0  blk_4:0  blk_5:0  blk_6:0  blk_7:0  
 Memory cost by network buffers: 464 bytes
 Total memory occupied: 2816 bytes
Compling done in 0 ms

Print layer input/output..
Layer(#)        -  Input(Qnm)  Output(Qnm)   Oshape 
----------------------------------------------------------
#1  Input      -    0. 7      0. 7      (   1,   1, 147,)
#2  Dense      -    0. 7      1. 6      (   1,   8,     )
#3  RNN/GRU    -    1. 6      0. 7      (   1,   8,     )
#4  RNN/GRU    -    0. 7      0. 7      (   1,   8,     )
#5  Dense      -    0. 7      1. 6      (   1,          )
#6  Output     -    1. 6      1. 6      (   1,          )
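Comparing the two logs also shows where the memory goes: fixing the Dense output shape shrinks blk_2 from 64 bytes to 8, and that 56-byte difference accounts for the entire drop in both the network-buffer cost and the total footprint. A quick check using the numbers copied from the logs above:

```python
# Memory figures copied from the two compile logs above.
before = {"blk_2": 64, "buffers": 520, "total": 2872}
after  = {"blk_2": 8,  "buffers": 464, "total": 2816}

# The shrunken blk_2 accounts for the whole saving.
saved = before["blk_2"] - after["blk_2"]          # 56 bytes
assert before["buffers"] - after["buffers"] == saved
assert before["total"] - after["total"] == saved
print(f"memory saved: {saved} bytes")
```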