Closed qinhui99 closed 5 years ago
Dear @qinhui99, I apologize for the late response. The `self.Conv_1x1` layer was added to match the channel depth of `x`. You can remove `self.Conv_1x1` if the channel depth of `x` is the same as that of `x1`.
If you have no further questions, I will close the issue. I hope my answer was helpful.
I checked this function and see that `x` is changed by `x = self.Conv_1x1(x)`, so `return x+x1` may be a bug.
```python
def forward(self, x):
    x = self.Conv_1x1(x)
    x1 = self.RCNN(x)
    return x + x1
```
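For context, a minimal runnable sketch of why the sum is shape-valid: the 1x1 conv maps the input to the same channel depth that `RCNN` outputs, so `x + x1` broadcasts cleanly. The class name, constructor parameters, and the body of `RCNN` below are assumptions for illustration (a simple stand-in, not the repo's actual recurrent block); only the names `Conv_1x1` and `RCNN` come from the code above.

```python
import torch
import torch.nn as nn

class RRCNNBlockSketch(nn.Module):
    """Hypothetical sketch of the block under discussion."""
    def __init__(self, ch_in, ch_out):
        super().__init__()
        # 1x1 conv aligns the input channel depth with RCNN's output depth
        self.Conv_1x1 = nn.Conv2d(ch_in, ch_out, kernel_size=1)
        # stand-in for the recurrent conv block; preserves ch_out channels
        self.RCNN = nn.Sequential(
            nn.Conv2d(ch_out, ch_out, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
        )

    def forward(self, x):
        x = self.Conv_1x1(x)   # x now has ch_out channels
        x1 = self.RCNN(x)      # same shape as x
        return x + x1          # channel-matched residual sum

x = torch.randn(1, 3, 8, 8)
out = RRCNNBlockSketch(ch_in=3, ch_out=16)(x)
print(out.shape)  # torch.Size([1, 16, 8, 8])
```

Under this reading, reassigning `x` before the sum is intentional: the residual is taken over the channel-matched tensor, not the raw input.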