Looking at the darknet2caffe.py script, I found that the `prelu` activation layers in the cfg are converted into ReLU layers, which degrades the accuracy of the converted Caffe model. How should the conversion script be modified so that the converted Caffe model keeps PReLU as the activation layer?

```python
if block['activation'] != 'linear':
    activate_layer = OrderedDict()
    activate_layer['bottom'] = bottom
    activate_layer['top'] = bottom
    if block.has_key('name'):
        activate_layer['name'] = '%s-act' % block['name']
    else:
        activate_layer['name'] = 'layer%d-act' % layer_id
    if block['activation'] == 'leaky':
        activate_layer['type'] = 'ReLU'
        relu_param = OrderedDict()
        relu_param['negative_slope'] = '0.1'
        activate_layer['relu_param'] = relu_param
    elif block['activation'] == 'mish':
        activate_layer['type'] = 'Mish'
```
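One possible fix is to add a `prelu` branch to the `if`/`elif` chain above that emits a Caffe `PReLU` layer (a standard Caffe layer type with a `prelu_param` message). The sketch below is a minimal, hedged version: it wraps the quoted chain in a hypothetical helper function `make_activation_layer` for self-containedness, and assumes the script appends the returned dict to its layer list the same way the `leaky` and `mish` branches do. The `channel_shared` value is an assumption about how the darknet model was trained.

```python
from collections import OrderedDict

def make_activation_layer(block, bottom, layer_id):
    # Mirrors the if/elif chain quoted above, with a new 'prelu' branch.
    activate_layer = OrderedDict()
    activate_layer['bottom'] = bottom
    activate_layer['top'] = bottom
    if 'name' in block:  # equivalent to block.has_key('name') under Python 2
        activate_layer['name'] = '%s-act' % block['name']
    else:
        activate_layer['name'] = 'layer%d-act' % layer_id
    if block['activation'] == 'leaky':
        activate_layer['type'] = 'ReLU'
        relu_param = OrderedDict()
        relu_param['negative_slope'] = '0.1'
        activate_layer['relu_param'] = relu_param
    elif block['activation'] == 'mish':
        activate_layer['type'] = 'Mish'
    elif block['activation'] == 'prelu':
        # New branch: emit a Caffe PReLU layer instead of falling back to ReLU.
        activate_layer['type'] = 'PReLU'
        prelu_param = OrderedDict()
        # channel_shared=false means one learned slope per channel (assumption:
        # this matches the darknet model; set to 'true' for a single shared slope).
        prelu_param['channel_shared'] = 'false'
        activate_layer['prelu_param'] = prelu_param
    return activate_layer
```

Note that fixing the prototxt generation alone may not be enough: unlike leaky ReLU, PReLU has learned slope parameters. If the darknet weights file stores those slopes, the weight-copying stage of darknet2caffe.py would also need to read them and write them into the PReLU layer's parameter blob; otherwise the converted model will use the filler default and accuracy will still drop.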