strand2013 / NNIE-lite

⚡️ Using NNIE as simply as using ncnn ⚡️
MIT License

ERROR: "caffe.LayerParameter" has no field named "bn_param" #10

Closed charlesyann closed 3 years ago

charlesyann commented 3 years ago

I trained ENet with the repo https://github.com/TimoSaemann/ENet and finally got a prototxt and model weights, but the bn_conv_merged_model.prototxt contains

```
bn_param {
  scale_filler { type: "constant" value: 1.0 }
  shift_filler { type: "constant" value: 0.0010000000474974513 }
  bn_mode: INFERENCE
}
```

so running the mapper fails:

```
./nnie_mapper_12 model_inst.cfg
Mapper Version 1.2.2.0_B010 (NNIE_1.2) 19050917062183
begin net parsing....
[libprotobuf ERROR google/protobuf/text_format.cc:307] Error parsing text-format caffe.NetParameter: 65:12: Message type "caffe.LayerParameter" has no field named "bn_param".
```

How do you deal with bn_param?

strand2013 commented 3 years ago

After ENet is trained, you should merge the Conv and BN layers. Specifically, run two scripts:

1. `./scripts/compute_bn_statistics.py $trained_prototxt $trained_caffemodel $new_model_save_dir`
2. `./scripts/BN-absorber-ent.py --model $deploy_prototxt --weights $new_model_save_dir/xxx.caffemodel --out_dir $final_save_dir`
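
For intuition, the absorption step folds the BN statistics into the preceding Conv layer's weights and bias, per output channel. A minimal NumPy sketch of that arithmetic (illustrative names, not the script's actual API):

```python
import numpy as np

def fold_bn_into_conv(w, b, gamma, beta, mean, var, eps=1e-5):
    """Fold BatchNorm statistics into the preceding Conv layer.

    Per channel, y = gamma * (conv(x) - mean) / sqrt(var + eps) + beta
    becomes conv'(x) with scaled weights and a shifted bias.
    Shapes: w is (out_ch, in_ch, kh, kw); b, gamma, beta, mean, var are (out_ch,).
    """
    scale = gamma / np.sqrt(var + eps)          # per-output-channel scale
    w_folded = w * scale[:, None, None, None]   # scale each output filter
    b_folded = (b - mean) * scale + beta        # shift the bias
    return w_folded, b_folded
```

The folded Conv then produces the same output as Conv followed by BN, which is why the bn layers (and their bn_param blocks) can be dropped from the deploy prototxt.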

charlesyann commented 3 years ago

Yes, I did, but the BN layers after Concat and after Deconvolution are not merged by ./scripts/BN-absorber-ent.py, so their bn_param blocks remain. The main difference between us is that your bn_conv_merged_model.prototxt has no bn_param, while mine (and TimoSaemann/ENet's) does.

strand2013 commented 3 years ago

Oh, maybe I found the key point: you should change the BN layer's type in your prototxt from `type: "BN"` to `type: "BatchNorm"`.
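
If you end up editing the prototxt by hand, a small helper like the following can do the rename across all layers; it also strips the `bn_param { ... }` blocks, on the assumption that the mapper's caffe.proto rejects that field regardless of layer type. This is a hypothetical sketch, not one of the repo's scripts:

```python
import re

def convert_bn_layers(prototxt_text):
    """Rename BN layers to BatchNorm and drop bn_param blocks.

    Hypothetical helper; adjust to your caffe.proto. The bn_param regex
    assumes at most one level of nested braces inside the block
    (e.g. the scale_filler / shift_filler sub-messages).
    """
    # type: "BN" -> type: "BatchNorm"
    text = re.sub(r'type:\s*"BN"', 'type: "BatchNorm"', prototxt_text)
    # Remove bn_param { ... } blocks, including one nesting level.
    text = re.sub(r'bn_param\s*\{(?:[^{}]*\{[^{}]*\})*[^{}]*\}', '', text)
    return text
```

Run it over the prototxt text and write the result back out before invoking nnie_mapper.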

charlesyann commented 3 years ago

Yes, it might be the Caffe version; changing the BN layer type solved the problem.