Several layers don't apply ReLU in the forward function. here
Do these layers not need ReLU in the FPN structure? In the MultiPoseNet paper, the authors don't mention using ReLU in some parts, but for the smooth step they say they use a 3*3 conv followed by ReLU; it seems you forgot it.
In the paper, they use two 3*3 convs to transform M2 into K2, but in your model there is only one.
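To make the question concrete, here is a minimal PyTorch sketch of what I understand the paper describes: two 3*3 convs, each followed by ReLU, transforming a merged FPN map (e.g. M2) into the output map (e.g. K2). The module name `Smooth` and the channel count of 256 are my assumptions for illustration, not taken from your code.

```python
import torch
import torch.nn as nn

class Smooth(nn.Module):
    """Hypothetical sketch: two 3x3 conv + ReLU blocks, as I read the paper."""

    def __init__(self, in_ch: int = 256, out_ch: int = 256):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(out_ch, out_ch, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.body(x)

# padding=1 keeps the spatial size, so K2 matches M2 spatially
m2 = torch.randn(1, 256, 64, 64)  # merged feature map M2 (shape assumed)
k2 = Smooth()(m2)                 # output map K2
```

Is this the structure you intended, or is a single conv deliberate?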
Maybe I have misunderstood some part; sorry to bother you, and I look forward to your reply.