niaoyu closed this issue 6 years ago
Hi,@niaoyu
It's not a functional layer, it's a network, which means the backward part is already implemented in the layers I used (such as the pooling layer). What I was doing was just splicing those layers together. If you are interested in creating a new layer, you can create a new repository and I'm willing to help.
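To illustrate the point about splicing existing layers: a minimal sketch of a spatial pyramid pooling module built only from built-in, autograd-tracked operations. The class name `SPPLayer` and the pyramid levels `(1, 2, 4)` are my assumptions for this example, not taken from the repository; the key point is that no `backward` method is written anywhere.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SPPLayer(nn.Module):
    """Spatial pyramid pooling composed only of built-in ops.

    Because forward() uses nothing but autograd-tracked operations
    (adaptive_max_pool2d, view, cat), autograd derives the backward
    pass automatically; no custom backward implementation is needed.
    The pyramid levels here are illustrative assumptions.
    """
    def __init__(self, levels=(1, 2, 4)):
        super().__init__()
        self.levels = levels

    def forward(self, x):
        n = x.size(0)
        pooled = []
        for level in self.levels:
            # Pool the feature map down to a fixed level x level grid,
            # so the output size is independent of the input size.
            out = F.adaptive_max_pool2d(x, output_size=level)
            pooled.append(out.view(n, -1))
        return torch.cat(pooled, dim=1)
```

A quick check that gradients flow through without any hand-written backward:

```python
x = torch.randn(2, 3, 13, 13, requires_grad=True)
y = SPPLayer()(x)        # shape (2, 3 * (1 + 4 + 16))
y.sum().backward()       # works: autograd built the backward graph
```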
@yueruchen Thanks for your reply! I've just realized what the functions used in your code do. So we only need to implement the backward method when writing custom functional layers. Is that right?
@niaoyu I can't answer this question off the top of my head; you can learn the details from other repositories.
I just wonder whether it is valid to define a module 'SPP_NET' without a backward method, because in the forward method you use the self-defined layer 'spp_layer', which does not have a backward implementation. As shown in http://pytorch.org/tutorials/beginner/examples_autograd/two_layer_net_custom_function.html, in PyTorch, if we want to define a new autograd function, shouldn't we implement forward and backward together?
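For contrast with the module-composition case above: the linked tutorial is about `torch.autograd.Function` subclasses, where you write raw tensor math that autograd cannot differentiate on its own, so forward and backward must indeed both be defined. A minimal sketch in the style of that tutorial (the class name `MyReLU` is illustrative):

```python
import torch

class MyReLU(torch.autograd.Function):
    """Custom autograd Function.

    Here forward() computes the result directly with tensor math, so
    autograd cannot trace it; we must supply backward() ourselves.
    This is the case the tutorial covers, and it is different from an
    nn.Module whose forward only chains existing autograd-tracked ops.
    """
    @staticmethod
    def forward(ctx, x):
        ctx.save_for_backward(x)
        return x.clamp(min=0)

    @staticmethod
    def backward(ctx, grad_output):
        # Gradient of ReLU: pass the incoming gradient through where
        # the input was positive, zero it elsewhere.
        x, = ctx.saved_tensors
        grad_input = grad_output.clone()
        grad_input[x < 0] = 0
        return grad_input
```

Usage goes through `.apply`, e.g. `y = MyReLU.apply(x)`; calling `y.sum().backward()` then invokes the hand-written `backward` above.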