cvcode18 / imbalanced_learning

104 stars 20 forks

about kernel_size=7 or 14 #13

Open dmuqlzhang opened 5 years ago

dmuqlzhang commented 5 years ago

I don't understand the kernel_size=7 or 14 before the last FCN layer in the attention module; it isn't reflected in the paper (DIAA). Could you please explain it when you have time? Thanks a lot!

dmuqlzhang commented 5 years ago

Also, I'd like to understand why kernel_size is set equal to the feature map size in the last layer of the attention module.
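For context on the question: a convolution whose kernel_size equals the spatial size of its input (7 for a 7x7 map, 14 for a 14x14 map) produces a 1x1 output per filter, which is mathematically the same as a fully connected layer over the flattened spatial grid. A minimal NumPy sketch (the 7x7 shape is illustrative, not taken from the repo's code):

```python
import numpy as np

def conv2d_valid(x, w):
    """Single-channel 'valid' convolution (no padding, stride 1)."""
    H, W = x.shape
    k, _ = w.shape
    out = np.empty((H - k + 1, W - k + 1))
    for i in range(H - k + 1):
        for j in range(W - k + 1):
            out[i, j] = np.sum(x[i:i + k, j:j + k] * w)
    return out

x = np.random.rand(7, 7)   # hypothetical 7x7 attention feature map
w = np.random.rand(7, 7)   # kernel_size == feature map size
y = conv2d_valid(x, w)
print(y.shape)             # (1, 1): spatial dims collapse to a single value

# Equivalent to a fully connected layer over the flattened map:
assert np.allclose(y[0, 0], x.ravel() @ w.ravel())
```

So if kernel_size matched the feature-map size here, each output filter would aggregate the entire spatial map into one scalar, which may be the intent behind the question.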