WDQGO opened this issue 1 year ago
```
----------------------------------------------------------------
        Layer (type)               Output Shape         Param #
================================================================
            Conv2d-1         [-1, 40, 640, 640]           1,080
       BatchNorm2d-2         [-1, 40, 640, 640]              80
              SiLU-3         [-1, 40, 640, 640]               0
              SiLU-4         [-1, 40, 640, 640]               0
              SiLU-5         [-1, 40, 640, 640]               0
              ... (identical SiLU rows repeat through SiLU-27; output truncated in the original)
```
This looks like a display issue on the PyTorch/torchsummary side... the final parameter count is correct.
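One plausible explanation (an assumption, not confirmed in this thread): torchsummary emits one row per forward *call* via hooks, so a single `nn.SiLU` instance reused at many points in the network gets listed once per call, inflating the layer list without affecting the parameter total. A minimal pure-Python sketch of that hook behavior, with hypothetical `Module`/`hook` names standing in for the torch machinery:

```python
# Sketch of torchsummary-style forward hooks (no torch needed):
# a hook fires once per CALL, so one shared activation module that is
# invoked many times produces many summary rows.

class Module:
    def __init__(self, name):
        self.name = name
        self.hooks = []

    def __call__(self, x):
        # mimic nn.Module: run registered forward hooks on every call
        for h in self.hooks:
            h(self, x)
        return x

summary_rows = []

def hook(module, x):
    # torchsummary names rows "ClassName-<row index>", so a reused module
    # appears as SiLU-3, SiLU-4, ... even though it is a single object
    summary_rows.append(f"{module.name}-{len(summary_rows) + 1}")

act = Module("SiLU")      # one shared activation instance
act.hooks.append(hook)    # the summary tool registers its hook once

for _ in range(5):        # the model's forward pass calls it five times
    act(0)

print(summary_rows)       # ['SiLU-1', 'SiLU-2', 'SiLU-3', 'SiLU-4', 'SiLU-5']
```

Since activations hold no parameters (Param # is 0 for every SiLU row), the duplicates never double-count anything, which is consistent with the final parameter total being correct.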