
NNgen: A Fully-Customizable Hardware Synthesis Compiler for Deep Neural Network

`Width of bit range is huge; vector of over 1billion bits: 0x40000000` occurred in torchvision_onnx_resnet18 example #19

Open in-die-nibelungen opened 4 years ago

in-die-nibelungen commented 4 years ago

Hello.

I'm very interested in inference on FPGAs. I found NNgen and have been trying out some of the examples. The cnn.py and mlp.py examples run without any problem, but torchvision_onnx_resnet18.py fails with the error `Width of bit range is huge; vector of over 1billion bits: 0x40000000`. It seems to be an error from Verilator.
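
Just to sanity-check the numbers in the error message (a rough calculation, not NNgen-specific; the guess that the offending vector models a flat simulation memory is only mine):

width_bits = 0x40000000              # width reported by Verilator for out.v line 108
print(width_bits)                    # 1073741824 bits, i.e. just over 1 billion
print(width_bits // 8 // 2 ** 20)    # 128 (MiB) if that vector is one flat memory image
mem_map_bytes = 18988735 + 1         # "entire range: [0 - 18988735]" in the log below
print(mem_map_bytes * 8)             # ~152 million bits, well below Verilator's limit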

My env is:

$ lsb_release -a
No LSB modules are available.
Distributor ID: Ubuntu
Description:    Ubuntu 18.04.4 LTS
Release:        18.04
Codename:       bionic

$ python --version
Python 3.6.9

$ pip freeze
attrs==19.3.0
cycler==0.10.0
future==0.18.2
importlib-metadata==1.6.1
Jinja2==2.11.2
kiwisolver==1.2.0
MarkupSafe==1.1.1
matplotlib==3.2.2
more-itertools==8.4.0
nngen==1.3.0
numpy==1.19.0
onnx==1.7.0
opencv-python==4.2.0.34
packaging==20.4
Pillow==7.1.2
pkg-resources==0.0.0
pluggy==0.13.1
protobuf==3.12.2
py==1.8.2
pyparsing==2.4.7
pytest==5.4.3
pytest-pythonpath==0.7.3
python-dateutil==2.8.1
pyverilog==1.2.1
six==1.15.0
torch==1.5.1
torchvision==0.6.1
tqdm==4.46.1
typing-extensions==3.7.4.2
veriloggen==1.8.2
wcwidth==0.2.5
zipp==3.1.0

Does this example work on your side? I would really appreciate it if you could tell me how to fix this, or where I should look to fix it.

Thanks.

Here is the full log from torchvision_onnx_resnet18.py:

act_scale_factor=128
# mout: ['n03770679', 'minivan'] (656) = 13164.278320
# mout: ['n04037443', 'racer'] (751) = 13019.651367
# mout: ['n04285008', 'sports_car'] (817) = 12681.737305
# mout: ['n03776460', 'mobile_home'] (660) = 12418.727539
# mout: ['n02930766', 'cab'] (468) = 12218.459961
# mout: ['n02701002', 'ambulance'] (407) = 12133.000000
# mout: ['n03769881', 'minibus'] (654) = 11861.643555
# mout: ['n03100240', 'convertible'] (511) = 11282.865234
# mout: ['n04065272', 'recreational_vehicle'] (757) = 11121.644531
# mout: ['n03895866', 'passenger_car'] (705) = 11090.961914
# vout: ['n03770679', 'minivan'] (656) = 12860
# vout: ['n02930766', 'cab'] (468) = 12160
# vout: ['n04037443', 'racer'] (751) = 12140
# vout: ['n04285008', 'sports_car'] (817) = 11785
# vout: ['n03769881', 'minibus'] (654) = 11709
# vout: ['n02701002', 'ambulance'] (407) = 11654
# vout: ['n03445924', 'golfcart'] (575) = 11335
# vout: ['n03444034', 'go-kart'] (573) = 11139
# vout: ['n03776460', 'mobile_home'] (660) = 11118
# vout: ['n03100240', 'convertible'] (511) = 10743
# top-10 hit: 8
# top-10 score: 66
NNgen: Neural Network Accelerator Generator (version 1.3.0)
[IP-XACT]
  Output: resnet18
[Configuration]
(AXI Master Interface)
  Data width   : 32
  Address width: 32
(AXI Slave Interface)
  Data width   : 32
  Address width: 32
[Schedule Table]
(Stage 0)
(Stage 1)
  <conv2d Conv_0 dtype:int16 shape:(1, 112, 112, 64) strides:(1, 2, 2, 1) padding:(3, 3, 3, 3) bias:(64,) scale:(64,) cshamt_out:12 act_func:relu sum_dtype:int64 concur_och:2 stationary:filter default_addr:12113600 g_index:0 l_index:1 word_alignment:2 aligned_shape:(1, 112, 112, 64) layout:('N', 'H', 'W', 'C') onnx_layout:('N', 'C', 'H', 'W') scale_factor:1142.533500>
  | <placeholder act dtype:int16 shape:(1, 224, 224, 3) default_addr:2048 g_index:2 word_alignment:2 aligned_shape:(1, 224, 224, 4) layout:('N', 'H', 'W', 'C') onnx_layout:('N', 'C', 'H', 'W') scale_factor:128.000000>
  | <variable conv1.weight dtype:int8 shape:(64, 7, 7, 3) default_addr:403456 g_index:3 word_alignment:4 aligned_shape:(64, 7, 7, 4) layout:('O', 'H', 'W', 'I') onnx_layout:('O', 'I', 'H', 'W') scale_factor:124.941797>
  | <variable onnx_Conv_0_conv.bias dtype:int32 shape:(64,) default_addr:403456 g_index:3 word_alignment:1 aligned_shape:(64,) scale_factor:15992.550033>
  | <variable onnx_Conv_0_conv.scale dtype:int8 shape:(64,) default_addr:403456 g_index:3 word_alignment:4 aligned_shape:(64,) scale_factor:292.624829>
(Stage 2)
  <max_pool MaxPool_3 dtype:int16 shape:(1, 56, 56, 64) ksize:(1, 3, 3, 1) strides:(1, 2, 2, 1) padding:(1, 1, 1, 1) default_addr:13719232 g_index:0 l_index:2 word_alignment:2 aligned_shape:(1, 56, 56, 64) layout:('N', 'H', 'W', 'C') onnx_layout:('N', 'C', 'H', 'W') scale_factor:1142.533500>
  | <conv2d Conv_0 dtype:int16 shape:(1, 112, 112, 64) strides:(1, 2, 2, 1) padding:(3, 3, 3, 3) bias:(64,) scale:(64,) cshamt_out:12 act_func:relu sum_dtype:int64 concur_och:2 stationary:filter default_addr:12113600 g_index:0 l_index:1 word_alignment:2 aligned_shape:(1, 112, 112, 64) layout:('N', 'H', 'W', 'C') onnx_layout:('N', 'C', 'H', 'W') scale_factor:1142.533500>
(Stage 3)
  <conv2d Conv_4 dtype:int16 shape:(1, 56, 56, 64) strides:(1, 1, 1, 1) padding:(1, 1, 1, 1) bias:(64,) scale:(64,) cshamt_out:11 act_func:relu sum_dtype:int64 concur_och:4 stationary:filter default_addr:14120640 g_index:0 l_index:3 word_alignment:2 aligned_shape:(1, 56, 56, 64) layout:('N', 'H', 'W', 'C') onnx_layout:('N', 'C', 'H', 'W') scale_factor:10191.864469>
  | <max_pool MaxPool_3 dtype:int16 shape:(1, 56, 56, 64) ksize:(1, 3, 3, 1) strides:(1, 2, 2, 1) padding:(1, 1, 1, 1) default_addr:13719232 g_index:0 l_index:2 word_alignment:2 aligned_shape:(1, 56, 56, 64) layout:('N', 'H', 'W', 'C') onnx_layout:('N', 'C', 'H', 'W') scale_factor:1142.533500>
  | <variable layer1.0.conv1.weight dtype:int8 shape:(64, 3, 3, 64) default_addr:403456 g_index:3 word_alignment:4 aligned_shape:(64, 3, 3, 64) layout:('O', 'H', 'W', 'I') onnx_layout:('O', 'I', 'H', 'W') scale_factor:158.887437>
  | <variable onnx_Conv_4_conv.bias dtype:int32 shape:(64,) default_addr:403456 g_index:3 word_alignment:1 aligned_shape:(64,) scale_factor:181534.219850>
  | <variable onnx_Conv_4_conv.scale dtype:int8 shape:(64,) default_addr:403456 g_index:3 word_alignment:4 aligned_shape:(64,) scale_factor:114.980737>
(Stage 4)
  <conv2d Conv_7 dtype:int16 shape:(1, 56, 56, 64) strides:(1, 1, 1, 1) padding:(1, 1, 1, 1) bias:(64,) scale:(64,) cshamt_out:15 sum_dtype:int64 concur_och:4 stationary:filter default_addr:14522048 g_index:0 l_index:4 word_alignment:2 aligned_shape:(1, 56, 56, 64) layout:('N', 'H', 'W', 'C') onnx_layout:('N', 'C', 'H', 'W') scale_factor:4399.746405>
  | <conv2d Conv_4 dtype:int16 shape:(1, 56, 56, 64) strides:(1, 1, 1, 1) padding:(1, 1, 1, 1) bias:(64,) scale:(64,) cshamt_out:11 act_func:relu sum_dtype:int64 concur_och:4 stationary:filter default_addr:14120640 g_index:0 l_index:3 word_alignment:2 aligned_shape:(1, 56, 56, 64) layout:('N', 'H', 'W', 'C') onnx_layout:('N', 'C', 'H', 'W') scale_factor:10191.864469>
  | <variable layer1.0.conv2.weight dtype:int8 shape:(64, 3, 3, 64) default_addr:403456 g_index:3 word_alignment:4 aligned_shape:(64, 3, 3, 64) layout:('O', 'H', 'W', 'I') onnx_layout:('O', 'I', 'H', 'W') scale_factor:260.291823>
  | <variable onnx_Conv_7_conv.bias dtype:int32 shape:(64,) default_addr:403456 g_index:3 word_alignment:1 aligned_shape:(64,) scale_factor:2652858.977967>
  | <variable onnx_Conv_7_conv.scale dtype:int8 shape:(64,) default_addr:403456 g_index:3 word_alignment:4 aligned_shape:(64,) scale_factor:54.345478>
(Stage 5)
  <relu Relu_10 dtype:int16 shape:(1, 56, 56, 64) default_addr:14923456 g_index:0 l_index:5 word_alignment:2 aligned_shape:(1, 56, 56, 64) layout:('N', 'H', 'W', 'C') onnx_layout:('N', 'C', 'H', 'W') scale_factor:790.579432>
  | <scaled_add None dtype:int16 shape:(1, 56, 56, 64) a_scale:23 b_scale:87 shamt:7 default_addr:0 word_alignment:2 aligned_shape:(1, 56, 56, 64) layout:('N', 'H', 'W', 'C') onnx_layout:('N', 'C', 'H', 'W') scale_factor:790.579432>
  | | <conv2d Conv_7 dtype:int16 shape:(1, 56, 56, 64) strides:(1, 1, 1, 1) padding:(1, 1, 1, 1) bias:(64,) scale:(64,) cshamt_out:15 sum_dtype:int64 concur_och:4 stationary:filter default_addr:14522048 g_index:0 l_index:4 word_alignment:2 aligned_shape:(1, 56, 56, 64) layout:('N', 'H', 'W', 'C') onnx_layout:('N', 'C', 'H', 'W') scale_factor:4399.746405>
  | | <max_pool MaxPool_3 dtype:int16 shape:(1, 56, 56, 64) ksize:(1, 3, 3, 1) strides:(1, 2, 2, 1) padding:(1, 1, 1, 1) default_addr:13719232 g_index:0 l_index:2 word_alignment:2 aligned_shape:(1, 56, 56, 64) layout:('N', 'H', 'W', 'C') onnx_layout:('N', 'C', 'H', 'W') scale_factor:1142.533500>
(Stage 6)
  <conv2d Conv_11 dtype:int16 shape:(1, 56, 56, 64) strides:(1, 1, 1, 1) padding:(1, 1, 1, 1) bias:(64,) scale:(64,) cshamt_out:12 act_func:relu sum_dtype:int64 concur_och:4 stationary:filter default_addr:15324864 g_index:0 l_index:6 word_alignment:2 aligned_shape:(1, 56, 56, 64) layout:('N', 'H', 'W', 'C') onnx_layout:('N', 'C', 'H', 'W') scale_factor:5659.140388>
  | <relu Relu_10 dtype:int16 shape:(1, 56, 56, 64) default_addr:14923456 g_index:0 l_index:5 word_alignment:2 aligned_shape:(1, 56, 56, 64) layout:('N', 'H', 'W', 'C') onnx_layout:('N', 'C', 'H', 'W') scale_factor:790.579432>
  | <variable layer1.1.conv1.weight dtype:int8 shape:(64, 3, 3, 64) default_addr:403456 g_index:3 word_alignment:4 aligned_shape:(64, 3, 3, 64) layout:('O', 'H', 'W', 'I') onnx_layout:('O', 'I', 'H', 'W') scale_factor:195.649934>
  | <variable onnx_Conv_11_conv.bias dtype:int32 shape:(64,) default_addr:403456 g_index:3 word_alignment:1 aligned_shape:(64,) scale_factor:154676.813687>
  | <variable onnx_Conv_11_conv.scale dtype:int8 shape:(64,) default_addr:403456 g_index:3 word_alignment:4 aligned_shape:(64,) scale_factor:149.859817>
(Stage 7)
  <conv2d Conv_14 dtype:int16 shape:(1, 56, 56, 64) strides:(1, 1, 1, 1) padding:(1, 1, 1, 1) bias:(64,) scale:(64,) cshamt_out:14 sum_dtype:int64 concur_och:4 stationary:filter default_addr:15726272 g_index:0 l_index:7 word_alignment:2 aligned_shape:(1, 56, 56, 64) layout:('N', 'H', 'W', 'C') onnx_layout:('N', 'C', 'H', 'W') scale_factor:4794.140635>
  | <conv2d Conv_11 dtype:int16 shape:(1, 56, 56, 64) strides:(1, 1, 1, 1) padding:(1, 1, 1, 1) bias:(64,) scale:(64,) cshamt_out:12 act_func:relu sum_dtype:int64 concur_och:4 stationary:filter default_addr:15324864 g_index:0 l_index:6 word_alignment:2 aligned_shape:(1, 56, 56, 64) layout:('N', 'H', 'W', 'C') onnx_layout:('N', 'C', 'H', 'W') scale_factor:5659.140388>
  | <variable layer1.1.conv2.weight dtype:int8 shape:(64, 3, 3, 64) default_addr:403456 g_index:3 word_alignment:4 aligned_shape:(64, 3, 3, 64) layout:('O', 'H', 'W', 'I') onnx_layout:('O', 'I', 'H', 'W') scale_factor:333.673567>
  | <variable onnx_Conv_14_conv.bias dtype:int32 shape:(64,) default_addr:403456 g_index:3 word_alignment:1 aligned_shape:(64,) scale_factor:1888305.561384>
  | <variable onnx_Conv_14_conv.scale dtype:int8 shape:(64,) default_addr:403456 g_index:3 word_alignment:4 aligned_shape:(64,) scale_factor:41.596658>
(Stage 8)
  <relu Relu_17 dtype:int16 shape:(1, 56, 56, 64) default_addr:16127680 g_index:0 l_index:8 word_alignment:2 aligned_shape:(1, 56, 56, 64) layout:('N', 'H', 'W', 'C') onnx_layout:('N', 'C', 'H', 'W') scale_factor:599.267579>
  | <scaled_add None dtype:int16 shape:(1, 56, 56, 64) a_scale:16 b_scale:97 shamt:7 default_addr:0 word_alignment:2 aligned_shape:(1, 56, 56, 64) layout:('N', 'H', 'W', 'C') onnx_layout:('N', 'C', 'H', 'W') scale_factor:599.267579>
  | | <conv2d Conv_14 dtype:int16 shape:(1, 56, 56, 64) strides:(1, 1, 1, 1) padding:(1, 1, 1, 1) bias:(64,) scale:(64,) cshamt_out:14 sum_dtype:int64 concur_och:4 stationary:filter default_addr:15726272 g_index:0 l_index:7 word_alignment:2 aligned_shape:(1, 56, 56, 64) layout:('N', 'H', 'W', 'C') onnx_layout:('N', 'C', 'H', 'W') scale_factor:4794.140635>
  | | <relu Relu_10 dtype:int16 shape:(1, 56, 56, 64) default_addr:14923456 g_index:0 l_index:5 word_alignment:2 aligned_shape:(1, 56, 56, 64) layout:('N', 'H', 'W', 'C') onnx_layout:('N', 'C', 'H', 'W') scale_factor:790.579432>
(Stage 9)
  <conv2d Conv_23 dtype:int16 shape:(1, 28, 28, 128) strides:(1, 2, 2, 1) padding:(0, 0, 0, 0) bias:(128,) scale:(128,) cshamt_out:11 sum_dtype:int64 concur_och:8 stationary:filter default_addr:16930496 g_index:0 l_index:11 word_alignment:2 aligned_shape:(1, 28, 28, 128) layout:('N', 'H', 'W', 'C') onnx_layout:('N', 'C', 'H', 'W') scale_factor:3878.994621>
  | <relu Relu_17 dtype:int16 shape:(1, 56, 56, 64) default_addr:16127680 g_index:0 l_index:8 word_alignment:2 aligned_shape:(1, 56, 56, 64) layout:('N', 'H', 'W', 'C') onnx_layout:('N', 'C', 'H', 'W') scale_factor:599.267579>
  | <variable layer2.0.downsample.0.weight dtype:int8 shape:(128, 1, 1, 64) default_addr:403456 g_index:3 word_alignment:4 aligned_shape:(128, 1, 1, 64) layout:('O', 'H', 'W', 'I') onnx_layout:('O', 'I', 'H', 'W') scale_factor:162.773091>
  | <variable onnx_Conv_23_conv.bias dtype:int32 shape:(128,) default_addr:403456 g_index:3 word_alignment:1 aligned_shape:(128,) scale_factor:97544.636106>
  | <variable onnx_Conv_23_conv.scale dtype:int8 shape:(128,) default_addr:403456 g_index:3 word_alignment:4 aligned_shape:(128,) scale_factor:81.441495>
(Stage 10)
  <conv2d Conv_18 dtype:int16 shape:(1, 28, 28, 128) strides:(1, 2, 2, 1) padding:(1, 1, 1, 1) bias:(128,) scale:(128,) cshamt_out:13 act_func:relu sum_dtype:int64 concur_och:8 stationary:filter default_addr:16529088 g_index:0 l_index:9 word_alignment:2 aligned_shape:(1, 28, 28, 128) layout:('N', 'H', 'W', 'C') onnx_layout:('N', 'C', 'H', 'W') scale_factor:4050.211701>
  | <relu Relu_17 dtype:int16 shape:(1, 56, 56, 64) default_addr:16127680 g_index:0 l_index:8 word_alignment:2 aligned_shape:(1, 56, 56, 64) layout:('N', 'H', 'W', 'C') onnx_layout:('N', 'C', 'H', 'W') scale_factor:599.267579>
  | <variable layer2.0.conv1.weight dtype:int8 shape:(128, 3, 3, 64) default_addr:403456 g_index:3 word_alignment:4 aligned_shape:(128, 3, 3, 64) layout:('O', 'H', 'W', 'I') onnx_layout:('O', 'I', 'H', 'W') scale_factor:372.939034>
  | <variable onnx_Conv_18_conv.bias dtype:int32 shape:(128,) default_addr:403456 g_index:3 word_alignment:1 aligned_shape:(128,) scale_factor:223490.272442>
  | <variable onnx_Conv_18_conv.scale dtype:int8 shape:(128,) default_addr:403456 g_index:3 word_alignment:4 aligned_shape:(128,) scale_factor:148.459859>
(Stage 11)
  <conv2d Conv_21 dtype:int16 shape:(1, 28, 28, 128) strides:(1, 1, 1, 1) padding:(1, 1, 1, 1) bias:(128,) scale:(128,) cshamt_out:14 sum_dtype:int64 concur_och:8 stationary:filter default_addr:16729792 g_index:0 l_index:10 word_alignment:2 aligned_shape:(1, 28, 28, 128) layout:('N', 'H', 'W', 'C') onnx_layout:('N', 'C', 'H', 'W') scale_factor:4283.457459>
  | <conv2d Conv_18 dtype:int16 shape:(1, 28, 28, 128) strides:(1, 2, 2, 1) padding:(1, 1, 1, 1) bias:(128,) scale:(128,) cshamt_out:13 act_func:relu sum_dtype:int64 concur_och:8 stationary:filter default_addr:16529088 g_index:0 l_index:9 word_alignment:2 aligned_shape:(1, 28, 28, 128) layout:('N', 'H', 'W', 'C') onnx_layout:('N', 'C', 'H', 'W') scale_factor:4050.211701>
  | <variable layer2.0.conv2.weight dtype:int8 shape:(128, 3, 3, 128) default_addr:403456 g_index:3 word_alignment:4 aligned_shape:(128, 3, 3, 128) layout:('O', 'H', 'W', 'I') onnx_layout:('O', 'I', 'H', 'W') scale_factor:297.312528>
  | <variable onnx_Conv_21_conv.bias dtype:int32 shape:(128,) default_addr:403456 g_index:3 word_alignment:1 aligned_shape:(128,) scale_factor:1204178.677881>
  | <variable onnx_Conv_21_conv.scale dtype:int8 shape:(128,) default_addr:403456 g_index:3 word_alignment:4 aligned_shape:(128,) scale_factor:58.280526>
(Stage 12)
  <relu Relu_26 dtype:int16 shape:(1, 28, 28, 128) default_addr:17131200 g_index:0 l_index:12 word_alignment:2 aligned_shape:(1, 28, 28, 128) layout:('N', 'H', 'W', 'C') onnx_layout:('N', 'C', 'H', 'W') scale_factor:2141.728729>
  | <scaled_add None dtype:int16 shape:(1, 28, 28, 128) a_scale:128 b_scale:141 shamt:8 default_addr:0 word_alignment:2 aligned_shape:(1, 28, 28, 128) layout:('N', 'H', 'W', 'C') onnx_layout:('N', 'C', 'H', 'W') scale_factor:2141.728729>
  | | <conv2d Conv_21 dtype:int16 shape:(1, 28, 28, 128) strides:(1, 1, 1, 1) padding:(1, 1, 1, 1) bias:(128,) scale:(128,) cshamt_out:14 sum_dtype:int64 concur_och:8 stationary:filter default_addr:16729792 g_index:0 l_index:10 word_alignment:2 aligned_shape:(1, 28, 28, 128) layout:('N', 'H', 'W', 'C') onnx_layout:('N', 'C', 'H', 'W') scale_factor:4283.457459>
  | | <conv2d Conv_23 dtype:int16 shape:(1, 28, 28, 128) strides:(1, 2, 2, 1) padding:(0, 0, 0, 0) bias:(128,) scale:(128,) cshamt_out:11 sum_dtype:int64 concur_och:8 stationary:filter default_addr:16930496 g_index:0 l_index:11 word_alignment:2 aligned_shape:(1, 28, 28, 128) layout:('N', 'H', 'W', 'C') onnx_layout:('N', 'C', 'H', 'W') scale_factor:3878.994621>
(Stage 13)
  <conv2d Conv_27 dtype:int16 shape:(1, 28, 28, 128) strides:(1, 1, 1, 1) padding:(1, 1, 1, 1) bias:(128,) scale:(128,) cshamt_out:14 act_func:relu sum_dtype:int64 concur_och:8 stationary:filter default_addr:17331904 g_index:0 l_index:13 word_alignment:2 aligned_shape:(1, 28, 28, 128) layout:('N', 'H', 'W', 'C') onnx_layout:('N', 'C', 'H', 'W') scale_factor:4259.782735>
  | <relu Relu_26 dtype:int16 shape:(1, 28, 28, 128) default_addr:17131200 g_index:0 l_index:12 word_alignment:2 aligned_shape:(1, 28, 28, 128) layout:('N', 'H', 'W', 'C') onnx_layout:('N', 'C', 'H', 'W') scale_factor:2141.728729>
  | <variable layer2.1.conv1.weight dtype:int8 shape:(128, 3, 3, 128) default_addr:403456 g_index:3 word_alignment:4 aligned_shape:(128, 3, 3, 128) layout:('O', 'H', 'W', 'I') onnx_layout:('O', 'I', 'H', 'W') scale_factor:289.459850>
  | <variable onnx_Conv_27_conv.bias dtype:int32 shape:(128,) default_addr:403456 g_index:3 word_alignment:1 aligned_shape:(128,) scale_factor:619944.477381>
  | <variable onnx_Conv_27_conv.scale dtype:int8 shape:(128,) default_addr:403456 g_index:3 word_alignment:4 aligned_shape:(128,) scale_factor:112.578276>
(Stage 14)
  <conv2d Conv_30 dtype:int16 shape:(1, 28, 28, 128) strides:(1, 1, 1, 1) padding:(1, 1, 1, 1) bias:(128,) scale:(128,) cshamt_out:14 sum_dtype:int64 concur_och:8 stationary:filter default_addr:17532608 g_index:0 l_index:14 word_alignment:2 aligned_shape:(1, 28, 28, 128) layout:('N', 'H', 'W', 'C') onnx_layout:('N', 'C', 'H', 'W') scale_factor:4165.645462>
  | <conv2d Conv_27 dtype:int16 shape:(1, 28, 28, 128) strides:(1, 1, 1, 1) padding:(1, 1, 1, 1) bias:(128,) scale:(128,) cshamt_out:14 act_func:relu sum_dtype:int64 concur_och:8 stationary:filter default_addr:17331904 g_index:0 l_index:13 word_alignment:2 aligned_shape:(1, 28, 28, 128) layout:('N', 'H', 'W', 'C') onnx_layout:('N', 'C', 'H', 'W') scale_factor:4259.782735>
  | <variable layer2.1.conv2.weight dtype:int8 shape:(128, 3, 3, 128) default_addr:403456 g_index:3 word_alignment:4 aligned_shape:(128, 3, 3, 128) layout:('O', 'H', 'W', 'I') onnx_layout:('O', 'I', 'H', 'W') scale_factor:357.085039>
  | <variable onnx_Conv_30_conv.bias dtype:int32 shape:(128,) default_addr:403456 g_index:3 word_alignment:1 aligned_shape:(128,) scale_factor:1521104.684964>
  | <variable onnx_Conv_30_conv.scale dtype:int8 shape:(128,) default_addr:403456 g_index:3 word_alignment:4 aligned_shape:(128,) scale_factor:44.868664>
(Stage 15)
  <relu Relu_33 dtype:int16 shape:(1, 28, 28, 128) default_addr:17733312 g_index:0 l_index:15 word_alignment:2 aligned_shape:(1, 28, 28, 128) layout:('N', 'H', 'W', 'C') onnx_layout:('N', 'C', 'H', 'W') scale_factor:1562.117048>
  | <scaled_add None dtype:int16 shape:(1, 28, 28, 128) a_scale:6 b_scale:11 shamt:4 default_addr:0 word_alignment:2 aligned_shape:(1, 28, 28, 128) layout:('N', 'H', 'W', 'C') onnx_layout:('N', 'C', 'H', 'W') scale_factor:1562.117048>
  | | <conv2d Conv_30 dtype:int16 shape:(1, 28, 28, 128) strides:(1, 1, 1, 1) padding:(1, 1, 1, 1) bias:(128,) scale:(128,) cshamt_out:14 sum_dtype:int64 concur_och:8 stationary:filter default_addr:17532608 g_index:0 l_index:14 word_alignment:2 aligned_shape:(1, 28, 28, 128) layout:('N', 'H', 'W', 'C') onnx_layout:('N', 'C', 'H', 'W') scale_factor:4165.645462>
  | | <relu Relu_26 dtype:int16 shape:(1, 28, 28, 128) default_addr:17131200 g_index:0 l_index:12 word_alignment:2 aligned_shape:(1, 28, 28, 128) layout:('N', 'H', 'W', 'C') onnx_layout:('N', 'C', 'H', 'W') scale_factor:2141.728729>
(Stage 16)
  <conv2d Conv_39 dtype:int16 shape:(1, 14, 14, 256) strides:(1, 2, 2, 1) padding:(0, 0, 0, 0) bias:(256,) scale:(256,) cshamt_out:13 sum_dtype:int64 concur_och:8 stationary:filter default_addr:18134720 g_index:0 l_index:18 word_alignment:2 aligned_shape:(1, 14, 14, 256) layout:('N', 'H', 'W', 'C') onnx_layout:('N', 'C', 'H', 'W') scale_factor:6425.889559>
  | <relu Relu_33 dtype:int16 shape:(1, 28, 28, 128) default_addr:17733312 g_index:0 l_index:15 word_alignment:2 aligned_shape:(1, 28, 28, 128) layout:('N', 'H', 'W', 'C') onnx_layout:('N', 'C', 'H', 'W') scale_factor:1562.117048>
  | <variable layer3.0.downsample.0.weight dtype:int8 shape:(256, 1, 1, 128) default_addr:403456 g_index:3 word_alignment:4 aligned_shape:(256, 1, 1, 128) layout:('O', 'H', 'W', 'I') onnx_layout:('O', 'I', 'H', 'W') scale_factor:474.500160>
  | <variable onnx_Conv_39_conv.bias dtype:int32 shape:(256,) default_addr:403456 g_index:3 word_alignment:1 aligned_shape:(256,) scale_factor:741224.789310>
  | <variable onnx_Conv_39_conv.scale dtype:int8 shape:(256,) default_addr:403456 g_index:3 word_alignment:4 aligned_shape:(256,) scale_factor:71.018789>
(Stage 17)
  <conv2d Conv_34 dtype:int16 shape:(1, 14, 14, 256) strides:(1, 2, 2, 1) padding:(1, 1, 1, 1) bias:(256,) scale:(256,) cshamt_out:13 act_func:relu sum_dtype:int64 concur_och:8 stationary:filter default_addr:17934016 g_index:0 l_index:16 word_alignment:2 aligned_shape:(1, 14, 14, 256) layout:('N', 'H', 'W', 'C') onnx_layout:('N', 'C', 'H', 'W') scale_factor:8267.976923>
  | <relu Relu_33 dtype:int16 shape:(1, 28, 28, 128) default_addr:17733312 g_index:0 l_index:15 word_alignment:2 aligned_shape:(1, 28, 28, 128) layout:('N', 'H', 'W', 'C') onnx_layout:('N', 'C', 'H', 'W') scale_factor:1562.117048>
  | <variable layer3.0.conv1.weight dtype:int8 shape:(256, 3, 3, 128) default_addr:403456 g_index:3 word_alignment:4 aligned_shape:(256, 3, 3, 128) layout:('O', 'H', 'W', 'I') onnx_layout:('O', 'I', 'H', 'W') scale_factor:324.020764>
  | <variable onnx_Conv_34_conv.bias dtype:int32 shape:(256,) default_addr:403456 g_index:3 word_alignment:1 aligned_shape:(256,) scale_factor:506158.359266>
  | <variable onnx_Conv_34_conv.scale dtype:int8 shape:(256,) default_addr:403456 g_index:3 word_alignment:4 aligned_shape:(256,) scale_factor:133.814380>
(Stage 18)
  <conv2d Conv_37 dtype:int16 shape:(1, 14, 14, 256) strides:(1, 1, 1, 1) padding:(1, 1, 1, 1) bias:(256,) scale:(256,) cshamt_out:16 sum_dtype:int64 concur_och:4 stationary:filter keep_input default_addr:18034368 g_index:0 l_index:17 word_alignment:2 aligned_shape:(1, 14, 14, 256) layout:('N', 'H', 'W', 'C') onnx_layout:('N', 'C', 'H', 'W') scale_factor:2698.672026>
  | <conv2d Conv_34 dtype:int16 shape:(1, 14, 14, 256) strides:(1, 2, 2, 1) padding:(1, 1, 1, 1) bias:(256,) scale:(256,) cshamt_out:13 act_func:relu sum_dtype:int64 concur_och:8 stationary:filter default_addr:17934016 g_index:0 l_index:16 word_alignment:2 aligned_shape:(1, 14, 14, 256) layout:('N', 'H', 'W', 'C') onnx_layout:('N', 'C', 'H', 'W') scale_factor:8267.976923>
  | <variable layer3.0.conv2.weight dtype:int8 shape:(256, 3, 3, 256) default_addr:403456 g_index:3 word_alignment:4 aligned_shape:(256, 3, 3, 256) layout:('O', 'H', 'W', 'I') onnx_layout:('O', 'I', 'H', 'W') scale_factor:380.793400>
  | <variable onnx_Conv_37_conv.bias dtype:int32 shape:(256,) default_addr:403456 g_index:3 word_alignment:1 aligned_shape:(256,) scale_factor:3148391.043963>
  | <variable onnx_Conv_37_conv.scale dtype:int8 shape:(256,) default_addr:403456 g_index:3 word_alignment:4 aligned_shape:(256,) scale_factor:56.174779>
(Stage 19)
  <relu Relu_42 dtype:int16 shape:(1, 14, 14, 256) default_addr:18235072 g_index:0 l_index:19 word_alignment:2 aligned_shape:(1, 14, 14, 256) layout:('N', 'H', 'W', 'C') onnx_layout:('N', 'C', 'H', 'W') scale_factor:1606.472390>
  | <scaled_add None dtype:int16 shape:(1, 14, 14, 256) a_scale:152 b_scale:64 shamt:8 default_addr:0 word_alignment:2 aligned_shape:(1, 14, 14, 256) layout:('N', 'H', 'W', 'C') onnx_layout:('N', 'C', 'H', 'W') scale_factor:1606.472390>
  | | <conv2d Conv_37 dtype:int16 shape:(1, 14, 14, 256) strides:(1, 1, 1, 1) padding:(1, 1, 1, 1) bias:(256,) scale:(256,) cshamt_out:16 sum_dtype:int64 concur_och:4 stationary:filter keep_input default_addr:18034368 g_index:0 l_index:17 word_alignment:2 aligned_shape:(1, 14, 14, 256) layout:('N', 'H', 'W', 'C') onnx_layout:('N', 'C', 'H', 'W') scale_factor:2698.672026>
  | | <conv2d Conv_39 dtype:int16 shape:(1, 14, 14, 256) strides:(1, 2, 2, 1) padding:(0, 0, 0, 0) bias:(256,) scale:(256,) cshamt_out:13 sum_dtype:int64 concur_och:8 stationary:filter default_addr:18134720 g_index:0 l_index:18 word_alignment:2 aligned_shape:(1, 14, 14, 256) layout:('N', 'H', 'W', 'C') onnx_layout:('N', 'C', 'H', 'W') scale_factor:6425.889559>
(Stage 20)
  <conv2d Conv_43 dtype:int16 shape:(1, 14, 14, 256) strides:(1, 1, 1, 1) padding:(1, 1, 1, 1) bias:(256,) scale:(256,) cshamt_out:13 act_func:relu sum_dtype:int64 concur_och:4 stationary:filter keep_input default_addr:18335424 g_index:0 l_index:20 word_alignment:2 aligned_shape:(1, 14, 14, 256) layout:('N', 'H', 'W', 'C') onnx_layout:('N', 'C', 'H', 'W') scale_factor:7188.637914>
  | <relu Relu_42 dtype:int16 shape:(1, 14, 14, 256) default_addr:18235072 g_index:0 l_index:19 word_alignment:2 aligned_shape:(1, 14, 14, 256) layout:('N', 'H', 'W', 'C') onnx_layout:('N', 'C', 'H', 'W') scale_factor:1606.472390>
  | <variable layer3.1.conv1.weight dtype:int8 shape:(256, 3, 3, 256) default_addr:403456 g_index:3 word_alignment:4 aligned_shape:(256, 3, 3, 256) layout:('O', 'H', 'W', 'I') onnx_layout:('O', 'I', 'H', 'W') scale_factor:428.993881>
  | <variable onnx_Conv_43_conv.bias dtype:int32 shape:(256,) default_addr:403456 g_index:3 word_alignment:1 aligned_shape:(256,) scale_factor:689166.825513>
  | <variable onnx_Conv_43_conv.scale dtype:int8 shape:(256,) default_addr:403456 g_index:3 word_alignment:4 aligned_shape:(256,) scale_factor:85.450024>
(Stage 21)
  <conv2d Conv_46 dtype:int16 shape:(1, 14, 14, 256) strides:(1, 1, 1, 1) padding:(1, 1, 1, 1) bias:(256,) scale:(256,) cshamt_out:15 sum_dtype:int64 concur_och:4 stationary:filter keep_input default_addr:18435776 g_index:0 l_index:21 word_alignment:2 aligned_shape:(1, 14, 14, 256) layout:('N', 'H', 'W', 'C') onnx_layout:('N', 'C', 'H', 'W') scale_factor:2946.116173>
  | <conv2d Conv_43 dtype:int16 shape:(1, 14, 14, 256) strides:(1, 1, 1, 1) padding:(1, 1, 1, 1) bias:(256,) scale:(256,) cshamt_out:13 act_func:relu sum_dtype:int64 concur_och:4 stationary:filter keep_input default_addr:18335424 g_index:0 l_index:20 word_alignment:2 aligned_shape:(1, 14, 14, 256) layout:('N', 'H', 'W', 'C') onnx_layout:('N', 'C', 'H', 'W') scale_factor:7188.637914>
  | <variable layer3.1.conv2.weight dtype:int8 shape:(256, 3, 3, 256) default_addr:403456 g_index:3 word_alignment:4 aligned_shape:(256, 3, 3, 256) layout:('O', 'H', 'W', 'I') onnx_layout:('O', 'I', 'H', 'W') scale_factor:385.377622>
  | <variable onnx_Conv_46_conv.bias dtype:int32 shape:(256,) default_addr:403456 g_index:3 word_alignment:1 aligned_shape:(256,) scale_factor:2770340.182131>
  | <variable onnx_Conv_46_conv.scale dtype:int8 shape:(256,) default_addr:403456 g_index:3 word_alignment:4 aligned_shape:(256,) scale_factor:34.847105>
(Stage 22)
  <relu Relu_49 dtype:int16 shape:(1, 14, 14, 256) default_addr:18536128 g_index:0 l_index:22 word_alignment:2 aligned_shape:(1, 14, 14, 256) layout:('N', 'H', 'W', 'C') onnx_layout:('N', 'C', 'H', 'W') scale_factor:1041.498100>
  | <scaled_add None dtype:int16 shape:(1, 14, 14, 256) a_scale:181 b_scale:331 shamt:9 default_addr:0 word_alignment:2 aligned_shape:(1, 14, 14, 256) layout:('N', 'H', 'W', 'C') onnx_layout:('N', 'C', 'H', 'W') scale_factor:1041.498100>
  | | <conv2d Conv_46 dtype:int16 shape:(1, 14, 14, 256) strides:(1, 1, 1, 1) padding:(1, 1, 1, 1) bias:(256,) scale:(256,) cshamt_out:15 sum_dtype:int64 concur_och:4 stationary:filter keep_input default_addr:18435776 g_index:0 l_index:21 word_alignment:2 aligned_shape:(1, 14, 14, 256) layout:('N', 'H', 'W', 'C') onnx_layout:('N', 'C', 'H', 'W') scale_factor:2946.116173>
  | | <relu Relu_42 dtype:int16 shape:(1, 14, 14, 256) default_addr:18235072 g_index:0 l_index:19 word_alignment:2 aligned_shape:(1, 14, 14, 256) layout:('N', 'H', 'W', 'C') onnx_layout:('N', 'C', 'H', 'W') scale_factor:1606.472390>
(Stage 23)
  <conv2d Conv_55 dtype:int16 shape:(1, 7, 7, 512) strides:(1, 2, 2, 1) padding:(0, 0, 0, 0) bias:(512,) scale:(512,) cshamt_out:12 sum_dtype:int64 concur_och:4 stationary:filter default_addr:18736832 g_index:0 l_index:25 word_alignment:2 aligned_shape:(1, 7, 7, 512) layout:('N', 'H', 'W', 'C') onnx_layout:('N', 'C', 'H', 'W') scale_factor:1794.669313>
  | <relu Relu_49 dtype:int16 shape:(1, 14, 14, 256) default_addr:18536128 g_index:0 l_index:22 word_alignment:2 aligned_shape:(1, 14, 14, 256) layout:('N', 'H', 'W', 'C') onnx_layout:('N', 'C', 'H', 'W') scale_factor:1041.498100>
  | <variable layer4.0.downsample.0.weight dtype:int8 shape:(512, 1, 1, 256) default_addr:403456 g_index:3 word_alignment:4 aligned_shape:(512, 1, 1, 256) layout:('O', 'H', 'W', 'I') onnx_layout:('O', 'I', 'H', 'W') scale_factor:169.974670>
  | <variable onnx_Conv_55_conv.bias dtype:int32 shape:(512,) default_addr:403456 g_index:3 word_alignment:1 aligned_shape:(512,) scale_factor:177028.296269>
  | <variable onnx_Conv_55_conv.scale dtype:int8 shape:(512,) default_addr:403456 g_index:3 word_alignment:4 aligned_shape:(512,) scale_factor:41.524240>
(Stage 24)
  <conv2d Conv_50 dtype:int16 shape:(1, 7, 7, 512) strides:(1, 2, 2, 1) padding:(1, 1, 1, 1) bias:(512,) scale:(512,) cshamt_out:12 act_func:relu sum_dtype:int64 concur_och:4 stationary:filter keep_input default_addr:18636480 g_index:0 l_index:23 word_alignment:2 aligned_shape:(1, 7, 7, 512) layout:('N', 'H', 'W', 'C') onnx_layout:('N', 'C', 'H', 'W') scale_factor:9497.599254>
  | <relu Relu_49 dtype:int16 shape:(1, 14, 14, 256) default_addr:18536128 g_index:0 l_index:22 word_alignment:2 aligned_shape:(1, 14, 14, 256) layout:('N', 'H', 'W', 'C') onnx_layout:('N', 'C', 'H', 'W') scale_factor:1041.498100>
  | <variable layer4.0.conv1.weight dtype:int8 shape:(512, 3, 3, 256) default_addr:403456 g_index:3 word_alignment:4 aligned_shape:(512, 3, 3, 256) layout:('O', 'H', 'W', 'I') onnx_layout:('O', 'I', 'H', 'W') scale_factor:331.920946>
  | <variable onnx_Conv_50_conv.bias dtype:int32 shape:(512,) default_addr:403456 g_index:3 word_alignment:1 aligned_shape:(512,) scale_factor:345695.034168>
  | <variable onnx_Conv_50_conv.scale dtype:int8 shape:(512,) default_addr:403456 g_index:3 word_alignment:4 aligned_shape:(512,) scale_factor:112.533195>
(Stage 25)
  <conv2d Conv_53 dtype:int16 shape:(1, 7, 7, 512) strides:(1, 1, 1, 1) padding:(1, 1, 1, 1) bias:(512,) scale:(512,) cshamt_out:15 sum_dtype:int64 concur_och:2 stationary:filter keep_input default_addr:18686656 g_index:0 l_index:24 word_alignment:2 aligned_shape:(1, 7, 7, 512) layout:('N', 'H', 'W', 'C') onnx_layout:('N', 'C', 'H', 'W') scale_factor:3226.543472>
  | <conv2d Conv_50 dtype:int16 shape:(1, 7, 7, 512) strides:(1, 2, 2, 1) padding:(1, 1, 1, 1) bias:(512,) scale:(512,) cshamt_out:12 act_func:relu sum_dtype:int64 concur_och:4 stationary:filter keep_input default_addr:18636480 g_index:0 l_index:23 word_alignment:2 aligned_shape:(1, 7, 7, 512) layout:('N', 'H', 'W', 'C') onnx_layout:('N', 'C', 'H', 'W') scale_factor:9497.599254>
  | <variable layer4.0.conv2.weight dtype:int8 shape:(512, 3, 3, 512) default_addr:403456 g_index:3 word_alignment:4 aligned_shape:(512, 3, 3, 512) layout:('O', 'H', 'W', 'I') onnx_layout:('O', 'I', 'H', 'W') scale_factor:364.188553>
  | <variable onnx_Conv_53_conv.bias dtype:int32 shape:(512,) default_addr:403456 g_index:3 word_alignment:1 aligned_shape:(512,) scale_factor:3458916.925484>
  | <variable onnx_Conv_53_conv.scale dtype:int8 shape:(512,) default_addr:403456 g_index:3 word_alignment:4 aligned_shape:(512,) scale_factor:30.566613>
(Stage 26)
  <relu Relu_58 dtype:int16 shape:(1, 7, 7, 512) default_addr:18787008 g_index:0 l_index:26 word_alignment:2 aligned_shape:(1, 7, 7, 512) layout:('N', 'H', 'W', 'C') onnx_layout:('N', 'C', 'H', 'W') scale_factor:1140.633532>
  | <scaled_add None dtype:int16 shape:(1, 7, 7, 512) a_scale:181 b_scale:325 shamt:9 default_addr:0 word_alignment:2 aligned_shape:(1, 7, 7, 512) layout:('N', 'H', 'W', 'C') onnx_layout:('N', 'C', 'H', 'W') scale_factor:1140.633532>
  | | <conv2d Conv_53 dtype:int16 shape:(1, 7, 7, 512) strides:(1, 1, 1, 1) padding:(1, 1, 1, 1) bias:(512,) scale:(512,) cshamt_out:15 sum_dtype:int64 concur_och:2 stationary:filter keep_input default_addr:18686656 g_index:0 l_index:24 word_alignment:2 aligned_shape:(1, 7, 7, 512) layout:('N', 'H', 'W', 'C') onnx_layout:('N', 'C', 'H', 'W') scale_factor:3226.543472>
  | | <conv2d Conv_55 dtype:int16 shape:(1, 7, 7, 512) strides:(1, 2, 2, 1) padding:(0, 0, 0, 0) bias:(512,) scale:(512,) cshamt_out:12 sum_dtype:int64 concur_och:4 stationary:filter default_addr:18736832 g_index:0 l_index:25 word_alignment:2 aligned_shape:(1, 7, 7, 512) layout:('N', 'H', 'W', 'C') onnx_layout:('N', 'C', 'H', 'W') scale_factor:1794.669313>
(Stage 27)
  <conv2d Conv_59 dtype:int16 shape:(1, 7, 7, 512) strides:(1, 1, 1, 1) padding:(1, 1, 1, 1) bias:(512,) scale:(512,) cshamt_out:12 act_func:relu sum_dtype:int64 concur_och:2 stationary:filter keep_input default_addr:18837184 g_index:0 l_index:27 word_alignment:2 aligned_shape:(1, 7, 7, 512) layout:('N', 'H', 'W', 'C') onnx_layout:('N', 'C', 'H', 'W') scale_factor:13532.546467>
  | <relu Relu_58 dtype:int16 shape:(1, 7, 7, 512) default_addr:18787008 g_index:0 l_index:26 word_alignment:2 aligned_shape:(1, 7, 7, 512) layout:('N', 'H', 'W', 'C') onnx_layout:('N', 'C', 'H', 'W') scale_factor:1140.633532>
  | <variable layer4.1.conv1.weight dtype:int8 shape:(512, 3, 3, 512) default_addr:403456 g_index:3 word_alignment:4 aligned_shape:(512, 3, 3, 512) layout:('O', 'H', 'W', 'I') onnx_layout:('O', 'I', 'H', 'W') scale_factor:477.772058>
  | <variable onnx_Conv_59_conv.bias dtype:int32 shape:(512,) default_addr:403456 g_index:3 word_alignment:1 aligned_shape:(512,) scale_factor:544962.830384>
  | <variable onnx_Conv_59_conv.scale dtype:int8 shape:(512,) default_addr:403456 g_index:3 word_alignment:4 aligned_shape:(512,) scale_factor:101.712093>
(Stage 28)
  <conv2d Conv_62 dtype:int16 shape:(1, 7, 7, 512) strides:(1, 1, 1, 1) padding:(1, 1, 1, 1) bias:(512,) scale:(512,) cshamt_out:15 sum_dtype:int64 concur_och:2 stationary:filter keep_input default_addr:18887360 g_index:0 l_index:28 word_alignment:2 aligned_shape:(1, 7, 7, 512) layout:('N', 'H', 'W', 'C') onnx_layout:('N', 'C', 'H', 'W') scale_factor:1286.143315>
  | <conv2d Conv_59 dtype:int16 shape:(1, 7, 7, 512) strides:(1, 1, 1, 1) padding:(1, 1, 1, 1) bias:(512,) scale:(512,) cshamt_out:12 act_func:relu sum_dtype:int64 concur_och:2 stationary:filter keep_input default_addr:18837184 g_index:0 l_index:27 word_alignment:2 aligned_shape:(1, 7, 7, 512) layout:('N', 'H', 'W', 'C') onnx_layout:('N', 'C', 'H', 'W') scale_factor:13532.546467>
  | <variable layer4.1.conv2.weight dtype:int8 shape:(512, 3, 3, 512) default_addr:403456 g_index:3 word_alignment:4 aligned_shape:(512, 3, 3, 512) layout:('O', 'H', 'W', 'I') onnx_layout:('O', 'I', 'H', 'W') scale_factor:467.304966>
  | <variable onnx_Conv_62_conv.bias dtype:int32 shape:(512,) default_addr:403456 g_index:3 word_alignment:1 aligned_shape:(512,) scale_factor:6323826.164457>
  | <variable onnx_Conv_62_conv.scale dtype:int8 shape:(512,) default_addr:403456 g_index:3 word_alignment:4 aligned_shape:(512,) scale_factor:6.664374>
(Stage 29)
  <relu Relu_65 dtype:int16 shape:(1, 7, 7, 512) default_addr:18937536 g_index:0 l_index:29 word_alignment:2 aligned_shape:(1, 7, 7, 512) layout:('N', 'H', 'W', 'C') onnx_layout:('N', 'C', 'H', 'W') scale_factor:643.071657>
  | <scaled_add None dtype:int16 shape:(1, 7, 7, 512) a_scale:32 b_scale:36 shamt:6 default_addr:0 word_alignment:2 aligned_shape:(1, 7, 7, 512) layout:('N', 'H', 'W', 'C') onnx_layout:('N', 'C', 'H', 'W') scale_factor:643.071657>
  | | <conv2d Conv_62 dtype:int16 shape:(1, 7, 7, 512) strides:(1, 1, 1, 1) padding:(1, 1, 1, 1) bias:(512,) scale:(512,) cshamt_out:15 sum_dtype:int64 concur_och:2 stationary:filter keep_input default_addr:18887360 g_index:0 l_index:28 word_alignment:2 aligned_shape:(1, 7, 7, 512) layout:('N', 'H', 'W', 'C') onnx_layout:('N', 'C', 'H', 'W') scale_factor:1286.143315>
  | | <relu Relu_58 dtype:int16 shape:(1, 7, 7, 512) default_addr:18787008 g_index:0 l_index:26 word_alignment:2 aligned_shape:(1, 7, 7, 512) layout:('N', 'H', 'W', 'C') onnx_layout:('N', 'C', 'H', 'W') scale_factor:1140.633532>
(Stage 30)
  <avg_pool_serial GlobalAveragePool_66 dtype:int16 shape:(1, 1, 1, 512) ksize:(1, 7, 7, 1) strides:(1, 7, 7, 1) padding:(0, 0, 0, 0) no_reuse default_addr:18987712 g_index:0 l_index:30 word_alignment:2 aligned_shape:(1, 1, 1, 512) layout:('N', 'H', 'W', 'C') onnx_layout:('N', 'C', 'H', 'W') scale_factor:643.071657>
  | <relu Relu_65 dtype:int16 shape:(1, 7, 7, 512) default_addr:18937536 g_index:0 l_index:29 word_alignment:2 aligned_shape:(1, 7, 7, 512) layout:('N', 'H', 'W', 'C') onnx_layout:('N', 'C', 'H', 'W') scale_factor:643.071657>
(Stage 31)
  <_lazy_reshape Flatten_67 dtype:int16 shape:(1, 512) alias_of:GlobalAveragePool_66 default_addr:18987712 g_index:0 l_index:30 word_alignment:2 aligned_shape:(1, 512) scale_factor:643.071657>
  | <avg_pool_serial GlobalAveragePool_66 dtype:int16 shape:(1, 1, 1, 512) ksize:(1, 7, 7, 1) strides:(1, 7, 7, 1) padding:(0, 0, 0, 0) no_reuse default_addr:18987712 g_index:0 l_index:30 word_alignment:2 aligned_shape:(1, 1, 1, 512) layout:('N', 'H', 'W', 'C') onnx_layout:('N', 'C', 'H', 'W') scale_factor:643.071657>
(Stage 32)
  <matmul Gemm_68 dtype:int16 shape:(1, 1000) bias:(1000,) scale:(1,) cshamt_out:13 sum_dtype:int64 concur_out_col:2 stationary:right keep_left default_addr:0 g_index:1 word_alignment:2 aligned_shape:(1, 1000) scale_factor:1770.218673>
  | <_lazy_reshape Flatten_67 dtype:int16 shape:(1, 512) alias_of:GlobalAveragePool_66 default_addr:18987712 g_index:0 l_index:30 word_alignment:2 aligned_shape:(1, 512) scale_factor:643.071657>
  | <variable fc.weight dtype:int8 shape:(1000, 512) default_addr:403456 g_index:3 word_alignment:4 aligned_shape:(1000, 512) scale_factor:177.563530>
  | <variable fc.bias dtype:int32 shape:(1000,) default_addr:403456 g_index:3 word_alignment:1 aligned_shape:(1000,) scale_factor:114186.073760>
  | <variable onnx_Gemm_68_gemm.scale dtype:int8 shape:(1,) default_addr:403456 g_index:3 word_alignment:4 aligned_shape:(4,) scale_factor:127.000000>
[RAM (spec: num)]
  32-bit 1024-entry 2-port 1-bank RAM: 1
  32-bit 512-entry 2-port 1-bank RAM: 1
  32-bit 128-entry 2-port 1-bank RAM: 1
  16-bit 65536-entry 2-port 2-bank RAM: 1
  16-bit 8192-entry 2-port 2-bank RAM: 10
  16-bit 512-entry 2-port 2-bank RAM: 50
  8-bit 2048-entry 2-port 4-bank RAM: 50
[Substream (spec: num)]
  ('_max', (16, 0, True, 9)): 1
  ('acc_rshift_round_frac', (16, 0, True, 16, 0, True)): 1
  ('acc_rshift_round_frac', (64, 0, True, 64, 0, True)): 1
  ('add_tree', (64, 0, True, 1)): 1
  ('add_tree', (64, 0, True, 9)): 1
  ('add_tree', (64, 0, True, 49)): 1
  ('div_const', (16, 0, True, 7, 0, True)): 1
  ('mul_rshift_clip', (64, 0, True, 8, 0, True, 72, 0, True, 16, 0, True)): 1
  ('mul_rshift_round_madd', (16, 0, True, 8, 0, True, 24, 0, True)): 49
[Stream (spec: num)]
  (((<class 'nngen.operator.conv2d.conv2d'>, <dtype int16>, <dtype int8>, <dtype int32>, <dtype int8>), <dtype int16>, 1), 7, 7, None, <dtype int64>, 1, 1, 1, 1, 49, 49): 1
  (((<class 'nngen.operator.pool.max_pool'>, <dtype int16>), <dtype int16>, 1), 3, 3, 1): 1
  (((<class 'nngen.operator.conv2d.conv2d'>, <dtype int16>, <dtype int8>, <dtype int32>, <dtype int8>), <dtype int16>, 1), 3, 3, None, <dtype int64>, 1, 1, 1, 1, 9, 9): 1
  ((<class 'nngen.operator.relu.relu'>, ((<class 'nngen.operator.normalize.scaled_add'>, <dtype int16>, <dtype int16>), <dtype int16>, 1)), <dtype int16>, 1): 1
  (((<class 'nngen.operator.conv2d.conv2d'>, <dtype int16>, <dtype int8>, <dtype int32>, <dtype int8>), <dtype int16>, 1), 1, 1, None, <dtype int64>, 1, 1, 1, 1, 1, 1): 1
  ((((<class 'nngen.operator.pool_serial.avg_pool_serial'>, <dtype int16>), <dtype int16>, 1), 7, 7, True, 1), None, False): 1
  (((<class 'nngen.operator.basic._lazy_reshape'>, <dtype int16>), <dtype int16>, 1), True): 1
  (((<class 'nngen.operator.matmul.matmul'>, <dtype int16>, <dtype int8>, <dtype int32>, <dtype int8>), <dtype int16>, 1), 1, 1, None, <dtype int64>, 1, 1, 1, 1, 1, 1): 1
[Control (name (# states: num))]
  main_fsm (# states: 296)
  control_conv2d_105 (# states: 84)
  control_max_pool_107 (# states: 35)
  control_conv2d_110 (# states: 56)
  control_relu_116 (# states: 44)
  control_conv2d_135 (# states: 40)
  control_avg_pool_serial_189 (# states: 56)
  control_matmul_192 (# states: 40)
[Register Map]
   0 (R ): header0 (default: 0)
   4 (R ): header1 (default: 0)
   8 (R ): header2 (default: 0)
  12 (R ): header3 (default: 0)
  16 ( W): Start (set '1' to run)
  20 (R ): Busy (returns '1' when running)
  24 ( W): Reset (set '1' to initialize internal logic)
  28 (R ): Opcode from extern objects to SW (returns '0' when idle)
  32 ( W): Resume extern objects (set '1' to resume)
  36 (RW): Global address offset (default: 0)
  40 (RW): Address of temporal storages (size: 6714KB)
  44 (RW): Address of output (matmul) 'Gemm_68' (size: 2KB, dtype: int16, shape: (1, 1000), alignment: 2 words (4 bytes)), aligned shape: (1, 1000)
  48 (RW): Address of placeholder 'act' (size: 392KB, dtype: int16, shape: (1, 224, 224, 3), alignment: 2 words (4 bytes)), aligned shape: (1, 224, 224, 4)
  52 (RW): Address of variables 'conv1.weight', 'onnx_Conv_0_conv.bias', 'onnx_Conv_0_conv.scale', 'layer1.0.conv1.weight', 'onnx_Conv_4_conv.bias', 'onnx_Conv_4_conv.scale', 'layer1.0.conv2.weight', 'onnx_Conv_7_conv.bias', 'onnx_Conv_7_conv.scale', 'layer1.1.conv1.weight', 'onnx_Conv_11_conv.bias', 'onnx_Conv_11_conv.scale', 'layer1.1.conv2.weight', 'onnx_Conv_14_conv.bias', 'onnx_Conv_14_conv.scale', 'layer2.0.conv1.weight', 'onnx_Conv_18_conv.bias', 'onnx_Conv_18_conv.scale', 'layer2.0.conv2.weight', 'onnx_Conv_21_conv.bias', 'onnx_Conv_21_conv.scale', 'layer2.0.downsample.0.weight', 'onnx_Conv_23_conv.bias', 'onnx_Conv_23_conv.scale', 'layer2.1.conv1.weight', 'onnx_Conv_27_conv.bias', 'onnx_Conv_27_conv.scale', 'layer2.1.conv2.weight', 'onnx_Conv_30_conv.bias', 'onnx_Conv_30_conv.scale', 'layer3.0.conv1.weight', 'onnx_Conv_34_conv.bias', 'onnx_Conv_34_conv.scale', 'layer3.0.conv2.weight', 'onnx_Conv_37_conv.bias', 'onnx_Conv_37_conv.scale', 'layer3.0.downsample.0.weight', 'onnx_Conv_39_conv.bias', 'onnx_Conv_39_conv.scale', 'layer3.1.conv1.weight', 'onnx_Conv_43_conv.bias', 'onnx_Conv_43_conv.scale', 'layer3.1.conv2.weight', 'onnx_Conv_46_conv.bias', 'onnx_Conv_46_conv.scale', 'layer4.0.conv1.weight', 'onnx_Conv_50_conv.bias', 'onnx_Conv_50_conv.scale', 'layer4.0.conv2.weight', 'onnx_Conv_53_conv.bias', 'onnx_Conv_53_conv.scale', 'layer4.0.downsample.0.weight', 'onnx_Conv_55_conv.bias', 'onnx_Conv_55_conv.scale', 'layer4.1.conv1.weight', 'onnx_Conv_59_conv.bias', 'onnx_Conv_59_conv.scale', 'layer4.1.conv2.weight', 'onnx_Conv_62_conv.bias', 'onnx_Conv_62_conv.scale', 'fc.weight', 'fc.bias', 'onnx_Gemm_68_gemm.scale' (size: 11436KB)
[Default Memory Map (start - end)] (entire range: [0 - 18988735], size: 18544KB)
  [       0 -     2047]: output (matmul) 'Gemm_68' (size: 2KB, dtype: int16, shape: (1, 1000), alignment: 2 words (4 bytes)), aligned shape: (1, 1000)
  [    2048 -   403455]: placeholder 'act' (size: 392KB, dtype: int16, shape: (1, 224, 224, 3), alignment: 2 words (4 bytes)), aligned shape: (1, 224, 224, 4)
  [  403456 -   415999]: variable 'conv1.weight' (size: 13KB, dtype: int8, shape: (64, 7, 7, 3), alignment: 4 words (4 bytes)), aligned shape: (64, 7, 7, 4)
  [  416000 -   416255]: variable 'onnx_Conv_0_conv.bias' (size: 256B, dtype: int32, shape: (64,), alignment: 1 words (4 bytes)), aligned shape: (64,)
  [  416256 -   416319]: variable 'onnx_Conv_0_conv.scale' (size: 64B, dtype: int8, shape: (64,), alignment: 4 words (4 bytes)), aligned shape: (64,)
  [  416320 -   453183]: variable 'layer1.0.conv1.weight' (size: 36KB, dtype: int8, shape: (64, 3, 3, 64), alignment: 4 words (4 bytes)), aligned shape: (64, 3, 3, 64)
  [  453184 -   453439]: variable 'onnx_Conv_4_conv.bias' (size: 256B, dtype: int32, shape: (64,), alignment: 1 words (4 bytes)), aligned shape: (64,)
  [  453440 -   453503]: variable 'onnx_Conv_4_conv.scale' (size: 64B, dtype: int8, shape: (64,), alignment: 4 words (4 bytes)), aligned shape: (64,)
  [  453504 -   490367]: variable 'layer1.0.conv2.weight' (size: 36KB, dtype: int8, shape: (64, 3, 3, 64), alignment: 4 words (4 bytes)), aligned shape: (64, 3, 3, 64)
  [  490368 -   490623]: variable 'onnx_Conv_7_conv.bias' (size: 256B, dtype: int32, shape: (64,), alignment: 1 words (4 bytes)), aligned shape: (64,)
  [  490624 -   490687]: variable 'onnx_Conv_7_conv.scale' (size: 64B, dtype: int8, shape: (64,), alignment: 4 words (4 bytes)), aligned shape: (64,)
  [  490688 -   527551]: variable 'layer1.1.conv1.weight' (size: 36KB, dtype: int8, shape: (64, 3, 3, 64), alignment: 4 words (4 bytes)), aligned shape: (64, 3, 3, 64)
  [  527552 -   527807]: variable 'onnx_Conv_11_conv.bias' (size: 256B, dtype: int32, shape: (64,), alignment: 1 words (4 bytes)), aligned shape: (64,)
  [  527808 -   527871]: variable 'onnx_Conv_11_conv.scale' (size: 64B, dtype: int8, shape: (64,), alignment: 4 words (4 bytes)), aligned shape: (64,)
  [  527872 -   564735]: variable 'layer1.1.conv2.weight' (size: 36KB, dtype: int8, shape: (64, 3, 3, 64), alignment: 4 words (4 bytes)), aligned shape: (64, 3, 3, 64)
  [  564736 -   564991]: variable 'onnx_Conv_14_conv.bias' (size: 256B, dtype: int32, shape: (64,), alignment: 1 words (4 bytes)), aligned shape: (64,)
  [  564992 -   565055]: variable 'onnx_Conv_14_conv.scale' (size: 64B, dtype: int8, shape: (64,), alignment: 4 words (4 bytes)), aligned shape: (64,)
  [  565056 -   638783]: variable 'layer2.0.conv1.weight' (size: 72KB, dtype: int8, shape: (128, 3, 3, 64), alignment: 4 words (4 bytes)), aligned shape: (128, 3, 3, 64)
  [  638784 -   639295]: variable 'onnx_Conv_18_conv.bias' (size: 512B, dtype: int32, shape: (128,), alignment: 1 words (4 bytes)), aligned shape: (128,)
  [  639296 -   639423]: variable 'onnx_Conv_18_conv.scale' (size: 128B, dtype: int8, shape: (128,), alignment: 4 words (4 bytes)), aligned shape: (128,)
  [  639424 -   786879]: variable 'layer2.0.conv2.weight' (size: 144KB, dtype: int8, shape: (128, 3, 3, 128), alignment: 4 words (4 bytes)), aligned shape: (128, 3, 3, 128)
  [  786880 -   787391]: variable 'onnx_Conv_21_conv.bias' (size: 512B, dtype: int32, shape: (128,), alignment: 1 words (4 bytes)), aligned shape: (128,)
  [  787392 -   787519]: variable 'onnx_Conv_21_conv.scale' (size: 128B, dtype: int8, shape: (128,), alignment: 4 words (4 bytes)), aligned shape: (128,)
  [  787520 -   795711]: variable 'layer2.0.downsample.0.weight' (size: 8KB, dtype: int8, shape: (128, 1, 1, 64), alignment: 4 words (4 bytes)), aligned shape: (128, 1, 1, 64)
  [  795712 -   796223]: variable 'onnx_Conv_23_conv.bias' (size: 512B, dtype: int32, shape: (128,), alignment: 1 words (4 bytes)), aligned shape: (128,)
  [  796224 -   796351]: variable 'onnx_Conv_23_conv.scale' (size: 128B, dtype: int8, shape: (128,), alignment: 4 words (4 bytes)), aligned shape: (128,)
  [  796352 -   943807]: variable 'layer2.1.conv1.weight' (size: 144KB, dtype: int8, shape: (128, 3, 3, 128), alignment: 4 words (4 bytes)), aligned shape: (128, 3, 3, 128)
  [  943808 -   944319]: variable 'onnx_Conv_27_conv.bias' (size: 512B, dtype: int32, shape: (128,), alignment: 1 words (4 bytes)), aligned shape: (128,)
  [  944320 -   944447]: variable 'onnx_Conv_27_conv.scale' (size: 128B, dtype: int8, shape: (128,), alignment: 4 words (4 bytes)), aligned shape: (128,)
  [  944448 -  1091903]: variable 'layer2.1.conv2.weight' (size: 144KB, dtype: int8, shape: (128, 3, 3, 128), alignment: 4 words (4 bytes)), aligned shape: (128, 3, 3, 128)
  [ 1091904 -  1092415]: variable 'onnx_Conv_30_conv.bias' (size: 512B, dtype: int32, shape: (128,), alignment: 1 words (4 bytes)), aligned shape: (128,)
  [ 1092416 -  1092543]: variable 'onnx_Conv_30_conv.scale' (size: 128B, dtype: int8, shape: (128,), alignment: 4 words (4 bytes)), aligned shape: (128,)
  [ 1092544 -  1387455]: variable 'layer3.0.conv1.weight' (size: 288KB, dtype: int8, shape: (256, 3, 3, 128), alignment: 4 words (4 bytes)), aligned shape: (256, 3, 3, 128)
  [ 1387456 -  1388479]: variable 'onnx_Conv_34_conv.bias' (size: 1KB, dtype: int32, shape: (256,), alignment: 1 words (4 bytes)), aligned shape: (256,)
  [ 1388480 -  1388735]: variable 'onnx_Conv_34_conv.scale' (size: 256B, dtype: int8, shape: (256,), alignment: 4 words (4 bytes)), aligned shape: (256,)
  [ 1388736 -  1978559]: variable 'layer3.0.conv2.weight' (size: 576KB, dtype: int8, shape: (256, 3, 3, 256), alignment: 4 words (4 bytes)), aligned shape: (256, 3, 3, 256)
  [ 1978560 -  1979583]: variable 'onnx_Conv_37_conv.bias' (size: 1KB, dtype: int32, shape: (256,), alignment: 1 words (4 bytes)), aligned shape: (256,)
  [ 1979584 -  1979839]: variable 'onnx_Conv_37_conv.scale' (size: 256B, dtype: int8, shape: (256,), alignment: 4 words (4 bytes)), aligned shape: (256,)
  [ 1979840 -  2012607]: variable 'layer3.0.downsample.0.weight' (size: 32KB, dtype: int8, shape: (256, 1, 1, 128), alignment: 4 words (4 bytes)), aligned shape: (256, 1, 1, 128)
  [ 2012608 -  2013631]: variable 'onnx_Conv_39_conv.bias' (size: 1KB, dtype: int32, shape: (256,), alignment: 1 words (4 bytes)), aligned shape: (256,)
  [ 2013632 -  2013887]: variable 'onnx_Conv_39_conv.scale' (size: 256B, dtype: int8, shape: (256,), alignment: 4 words (4 bytes)), aligned shape: (256,)
  [ 2013888 -  2603711]: variable 'layer3.1.conv1.weight' (size: 576KB, dtype: int8, shape: (256, 3, 3, 256), alignment: 4 words (4 bytes)), aligned shape: (256, 3, 3, 256)
  [ 2603712 -  2604735]: variable 'onnx_Conv_43_conv.bias' (size: 1KB, dtype: int32, shape: (256,), alignment: 1 words (4 bytes)), aligned shape: (256,)
  [ 2604736 -  2604991]: variable 'onnx_Conv_43_conv.scale' (size: 256B, dtype: int8, shape: (256,), alignment: 4 words (4 bytes)), aligned shape: (256,)
  [ 2604992 -  3194815]: variable 'layer3.1.conv2.weight' (size: 576KB, dtype: int8, shape: (256, 3, 3, 256), alignment: 4 words (4 bytes)), aligned shape: (256, 3, 3, 256)
  [ 3194816 -  3195839]: variable 'onnx_Conv_46_conv.bias' (size: 1KB, dtype: int32, shape: (256,), alignment: 1 words (4 bytes)), aligned shape: (256,)
  [ 3195840 -  3196095]: variable 'onnx_Conv_46_conv.scale' (size: 256B, dtype: int8, shape: (256,), alignment: 4 words (4 bytes)), aligned shape: (256,)
  [ 3196096 -  4375743]: variable 'layer4.0.conv1.weight' (size: 1152KB, dtype: int8, shape: (512, 3, 3, 256), alignment: 4 words (4 bytes)), aligned shape: (512, 3, 3, 256)
  [ 4375744 -  4377791]: variable 'onnx_Conv_50_conv.bias' (size: 2KB, dtype: int32, shape: (512,), alignment: 1 words (4 bytes)), aligned shape: (512,)
  [ 4377792 -  4378303]: variable 'onnx_Conv_50_conv.scale' (size: 512B, dtype: int8, shape: (512,), alignment: 4 words (4 bytes)), aligned shape: (512,)
  [ 4378304 -  6737599]: variable 'layer4.0.conv2.weight' (size: 2304KB, dtype: int8, shape: (512, 3, 3, 512), alignment: 4 words (4 bytes)), aligned shape: (512, 3, 3, 512)
  [ 6737600 -  6739647]: variable 'onnx_Conv_53_conv.bias' (size: 2KB, dtype: int32, shape: (512,), alignment: 1 words (4 bytes)), aligned shape: (512,)
  [ 6739648 -  6740159]: variable 'onnx_Conv_53_conv.scale' (size: 512B, dtype: int8, shape: (512,), alignment: 4 words (4 bytes)), aligned shape: (512,)
  [ 6740160 -  6871231]: variable 'layer4.0.downsample.0.weight' (size: 128KB, dtype: int8, shape: (512, 1, 1, 256), alignment: 4 words (4 bytes)), aligned shape: (512, 1, 1, 256)
  [ 6871232 -  6873279]: variable 'onnx_Conv_55_conv.bias' (size: 2KB, dtype: int32, shape: (512,), alignment: 1 words (4 bytes)), aligned shape: (512,)
  [ 6873280 -  6873791]: variable 'onnx_Conv_55_conv.scale' (size: 512B, dtype: int8, shape: (512,), alignment: 4 words (4 bytes)), aligned shape: (512,)
  [ 6873792 -  9233087]: variable 'layer4.1.conv1.weight' (size: 2304KB, dtype: int8, shape: (512, 3, 3, 512), alignment: 4 words (4 bytes)), aligned shape: (512, 3, 3, 512)
  [ 9233088 -  9235135]: variable 'onnx_Conv_59_conv.bias' (size: 2KB, dtype: int32, shape: (512,), alignment: 1 words (4 bytes)), aligned shape: (512,)
  [ 9235136 -  9235647]: variable 'onnx_Conv_59_conv.scale' (size: 512B, dtype: int8, shape: (512,), alignment: 4 words (4 bytes)), aligned shape: (512,)
  [ 9235648 - 11594943]: variable 'layer4.1.conv2.weight' (size: 2304KB, dtype: int8, shape: (512, 3, 3, 512), alignment: 4 words (4 bytes)), aligned shape: (512, 3, 3, 512)
  [11594944 - 11596991]: variable 'onnx_Conv_62_conv.bias' (size: 2KB, dtype: int32, shape: (512,), alignment: 1 words (4 bytes)), aligned shape: (512,)
  [11596992 - 11597503]: variable 'onnx_Conv_62_conv.scale' (size: 512B, dtype: int8, shape: (512,), alignment: 4 words (4 bytes)), aligned shape: (512,)
  [11597504 - 12109503]: variable 'fc.weight' (size: 500KB, dtype: int8, shape: (1000, 512), alignment: 4 words (4 bytes)), aligned shape: (1000, 512)
  [12109504 - 12113535]: variable 'fc.bias' (size: 4KB, dtype: int32, shape: (1000,), alignment: 1 words (4 bytes)), aligned shape: (1000,)
  [12113536 - 12113599]: variable 'onnx_Gemm_68_gemm.scale' (size: 64B, dtype: int8, shape: (1,), alignment: 4 words (4 bytes)), aligned shape: (4,)
  [12113600 - 18988735]: temporal storages (size: 6714KB)
%Error: torchvision_onnx_resnet18.verilator.out/out.v:108: Width of bit range is huge; vector of over 1billion bits: 0x40000000
%Error: Exiting due to 1 error(s)
%Error: Command Failed /usr/bin/verilator_bin --cc -Wno-lint --Mdir torchvision_onnx_resnet18.verilator.out torchvision_onnx_resnet18.verilator.out/out.v --exe torchvision_onnx_resnet18.verilator.out/sim_test.cpp
make: Vout.mk: No such file or directory
make: *** No rule to make target 'Vout.mk'.  Stop.
/bin/sh: 1: ./torchvision_onnx_resnet18.verilator.out/Vout: not found
/pytorch/torch/csrc/utils/tensor_numpy.cpp:141: UserWarning: The given NumPy array is not writeable, and PyTorch does not support non-writeable tensors. This means you can write to the underlying (supposedly non-writeable) NumPy array using the tensor. You may want to copy the array to protect its data or make it writeable before converting it to a tensor. This type of warning will be suppressed for the rest of this program.
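
For reference, a quick way to list which bit ranges in the generated out.v are that large (a rough, hypothetical helper, not part of NNgen; it only catches ranges written with literal integer bounds):

import re

path = 'torchvision_onnx_resnet18.verilator.out/out.v'   # path from the Verilator command above
range_pat = re.compile(r'\[\s*(\d+)\s*:\s*(\d+)\s*\]')

with open(path) as f:
    for lineno, line in enumerate(f, 1):
        for left, right in range_pat.findall(line):
            size = abs(int(left) - int(right)) + 1
            if size >= 10 ** 9:   # roughly the "over 1 billion bits" threshold
                print(lineno, size, line.strip())
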
shtaxxx commented 3 years ago

I'm sorry for the very long delay in my response.

As you mentioned, this error comes from Verilator. I didn't run the entire simulation with Verilator for the ResNet18 and VGG-11 cases, so I never saw this error on my side.

If there is an option to increase the maximum array size in Verilator, we should be able to run this simulation. Does anyone know how to increase the maximum array size in Verilator?
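
To check whether a given Verilator build enforces this limit independently of NNgen, a minimal reproducer along the following lines might help (a sketch only; the file name and width are placeholders chosen to match the error above):

import subprocess

# Write a tiny Verilog module with a single 2**30-bit packed vector and feed it
# to Verilator with the same flags as in the log above. If the same
# "Width of bit range is huge" error appears, the limit is in Verilator itself.
with open('huge_vector.v', 'w') as f:
    f.write('module huge_vector;\n')
    f.write('  reg [1073741823:0] mem;\n')   # 2**30 bits, matching 0x40000000
    f.write('  initial $finish;\n')
    f.write('endmodule\n')

subprocess.run(['verilator', '--cc', '-Wno-lint', 'huge_vector.v'], check=False)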

in-die-nibelungen commented 3 years ago

@shtaxxx ,

Thanks for your comment.

As you mentioned, this error comes from Verilator. I didn't run the entire simulation with Verilator for the ResNet18 and VGG-11 cases, so I never saw this error on my side.

I got it. Then, are you using another simulator, or do you simply not run the simulation for these models? Another option might be to reduce the memory size.

Does anyone know how to increase the maximum array size in Verilator?

I would also like to know how to specify this in Verilator, if such an option exists.
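
In the meantime, this is roughly what I have in mind by "another simulator", assuming Icarus Verilog is installed (a sketch only; the RTL and testbench file names are placeholders, since I don't know what the example writes out for its iverilog flow):

import subprocess

# Hypothetical file names; adjust to whatever the example actually generates
# for its iverilog-based simulation flow.
rtl = 'resnet18.v'
testbench = 'resnet18_testbench.v'

subprocess.run(['iverilog', '-o', 'resnet18.sim', testbench, rtl], check=True)
subprocess.run(['vvp', 'resnet18.sim'], check=True)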