Open himsR opened 7 years ago
Have you tried it? Did back propagation work successfully when you added this layer?
Yes, the operation is differentiable
I think in the forward pass ROI pooling crops and pools, and in the backward pass the gradients from the different crops are accumulated back into the feature map?
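For intuition, here is a minimal NumPy sketch of that forward/backward behavior for a single max-based ROI (this is an illustrative toy, not the repository's actual implementation, and the function names `roi_max_pool` / `roi_max_pool_grad` are made up): the forward pass records the argmax position of each pooling cell, and the backward pass routes each output gradient back to that position, accumulating where ROIs or cells overlap.

```python
import numpy as np

def roi_max_pool(feature, roi, out_size=2):
    """Max-pool a rectangular crop of `feature` down to out_size x out_size.

    `roi` is (y0, x0, y1, x1) in pixel coordinates (illustrative convention).
    Returns the pooled output and the argmax coordinates needed for backprop.
    """
    y0, x0, y1, x1 = roi
    crop = feature[y0:y1, x0:x1]
    h, w = crop.shape
    pooled = np.zeros((out_size, out_size))
    argmax = np.zeros((out_size, out_size, 2), dtype=int)  # source (y, x)
    ys = np.linspace(0, h, out_size + 1).astype(int)
    xs = np.linspace(0, w, out_size + 1).astype(int)
    for i in range(out_size):
        for j in range(out_size):
            cell = crop[ys[i]:ys[i + 1], xs[j]:xs[j + 1]]
            idx = np.unravel_index(np.argmax(cell), cell.shape)
            pooled[i, j] = cell[idx]
            # Remember where the max came from, in feature-map coordinates.
            argmax[i, j] = (y0 + ys[i] + idx[0], x0 + xs[j] + idx[1])
    return pooled, argmax

def roi_max_pool_grad(feature_shape, argmax, grad_pooled):
    """Backward pass: route each output gradient to its argmax location."""
    grad = np.zeros(feature_shape)
    for i in range(argmax.shape[0]):
        for j in range(argmax.shape[1]):
            y, x = argmax[i, j]
            grad[y, x] += grad_pooled[i, j]  # += accumulates overlapping ROIs
    return grad

feat = np.arange(36, dtype=float).reshape(6, 6)
pooled, argmax = roi_max_pool(feat, (0, 0, 4, 4))
grad = roi_max_pool_grad(feat.shape, argmax, np.ones((2, 2)))
```

With multiple ROIs you would sum the per-ROI gradients into the same feature map, which is why the backward pass uses `+=` rather than assignment.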
Hi,
Thanks for providing the ROI pool layer in TensorFlow. I have a question regarding the backprop. As far as I know, all the built-in ops in TensorFlow are differentiable; if I add an op to my graph, TensorFlow will be able to backprop through that op during optimization. Is your implementation of ROI pool differentiable? That is, if I add it to my graph, do I need to worry about implementing the backpropagation of gradients through this op?
Thanks