[Open] pgr2015 opened this issue 5 years ago
My fix would be like this; it's quite straightforward. But I have a strong concern that this change will break the original logic of the source code.
# Original branch: a non-linear activation is accepted and returned.
if activation.__name__ != 'linear':
    if maybe_layer.get_output_shape_at(node_index) != output_shape:
        raise ValueError('The activation layer ({0}) does not have the same'
                         ' output shape as {1}'.format(maybe_layer.name,
                                                       layer.name))
    return maybe_layer, node_index
# Proposed addition: accept a linear activation in exactly the same way.
if activation.__name__ == 'linear':
    if maybe_layer.get_output_shape_at(node_index) != output_shape:
        raise ValueError('The activation layer ({0}) does not have the same'
                         ' output shape as {1}'.format(maybe_layer.name,
                                                       layer.name))
    return maybe_layer, node_index
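Note that the two branches above are identical, so the fix is equivalent to dropping the activation-name check altogether. A minimal sketch of that collapsed form, reusing the variable names from the snippet above:

# Collapsed equivalent of the two-branch fix: accept any activation,
# linear or not, as long as the output shape matches.
if maybe_layer.get_output_shape_at(node_index) != output_shape:
    raise ValueError('The activation layer ({0}) does not have the same'
                     ' output shape as {1}'.format(maybe_layer.name,
                                                   layer.name))
return maybe_layer, node_index

If this check sits inside the downstream-search loop in keras-surgeon's utils.py (linked below), the collapsed form makes that loop return on its very first iteration, which is exactly where my concern about breaking the original logic comes from.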
Hi, there are some linear activation layers in my model. Since keras-surgeon doesn't take linear activation layers into account, pruning my model failed. Here is the relevant code in keras-surgeon: https://github.com/BenWhetton/keras-surgeon/blob/master/src/kerassurgeon/utils.py#L143

My question is: why are only non-linear activations accepted? What if I add the if-statement for linear activations shown above? Will it break the original logic of the source code? Any ideas? Thanks in advance!
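For context, here is a simplified, self-contained paraphrase of what the linked find_activation_layer appears to do. The function name, the one-line error message, and the `_outbound_nodes` traversal at the bottom are my own simplifications of the node-graph helpers in utils.py, not the library's exact code:

from keras.activations import linear

def find_activation_layer_sketch(layer, node_index):
    # Simplified paraphrase of kerassurgeon.utils.find_activation_layer:
    # walk forward from `layer` until the first *non-linear* activation
    # is found; linear activations are treated as pass-throughs.
    output_shape = layer.get_output_shape_at(node_index)
    maybe_layer = layer
    while True:
        # A layer with no `activation` attribute counts as linear.
        activation = getattr(maybe_layer, 'activation', linear)
        if activation.__name__ != 'linear':
            # First non-linear activation found: check shape, return it.
            if maybe_layer.get_output_shape_at(node_index) != output_shape:
                raise ValueError('Activation layer output shape mismatch')
            return maybe_layer, node_index
        # Linear activation: keep walking downstream. (Simplified; the real
        # code uses node-graph helpers and errors out if the model branches
        # or the datastream ends, which matches the failure described above.)
        maybe_layer = maybe_layer._outbound_nodes[0].outbound_layer
        node_index = 0

Under that reading, the loop skips linear activations on purpose so it can return the "real" non-linear activation further downstream (for example, a separate Activation('relu') placed after a Conv2D with activation=None). Returning early on a linear activation would change which layer the pruning logic operates on whenever the model does continue past it.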