In ResNet, there is an activation function applied after adding the shortcut connection. However, in tl, ElementwiseLayer doesn't have an activation function. As far as I know, the way to add an activation after ElementwiseLayer is to get the output tensors, use tf.add to sum the two tensors, apply the activation, and finally use InputLayer to convert the tf tensor back into a tl layer. Is there a more elegant way to do that? I think an activation layer, as implemented in Keras, PyTorch, etc., would be a good optional addition.
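For reference, this is roughly the workaround I use now (a minimal sketch assuming the TensorLayer 1.x API, where layers expose a `.outputs` tensor; `add_with_act` is just a hypothetical helper name, not part of tl):

```python
import tensorflow as tf
import tensorlayer as tl

def add_with_act(branch, shortcut, name='res_add'):
    # pull the raw tensors out of the two tl layers,
    # add them, and apply the activation manually
    out = tf.nn.relu(tf.add(branch.outputs, shortcut.outputs))
    # wrap the resulting tensor back into a tl layer so the
    # rest of the network can keep stacking on top of it
    return tl.layers.InputLayer(inputs=out, name=name)
```

It works, but it feels clumsy compared to passing something like an `act` argument directly to ElementwiseLayer.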