Open thb1314 opened 5 months ago
Hi
For the teacher model, the entire network, including its CBAM parameters, is trained. During the student's training, all teacher parameters are frozen and only the student's parameters, including its attention module, are updated. The CBAM module is defined inside the DeepLab model, so its parameters are updated during backpropagation like every other parameter of the network.
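A minimal sketch of this setup, assuming a PyTorch implementation (the `ToyModel` and `ToyAttention` classes below are placeholders standing in for the DeepLab backbone and its CBAM module, not the repository's real classes):

```python
import torch
import torch.nn as nn

# Toy stand-in for illustration only: a small "backbone + attention" model.
class ToyAttention(nn.Module):
    def __init__(self, channels):
        super().__init__()
        self.fc = nn.Conv2d(channels, channels, kernel_size=1)

    def forward(self, x):
        return x * torch.sigmoid(self.fc(x))

class ToyModel(nn.Module):
    def __init__(self, channels=16):
        super().__init__()
        self.backbone = nn.Conv2d(3, channels, kernel_size=3, padding=1)
        self.attention = ToyAttention(channels)  # plays the role of CBAM

    def forward(self, x):
        return self.attention(self.backbone(x))

teacher = ToyModel()
student = ToyModel()

# Freeze every teacher parameter, attention module included.
for p in teacher.parameters():
    p.requires_grad = False
teacher.eval()

# The optimizer only receives the student's parameters; because the attention
# module is a registered submodule, its weights are included and updated
# by backpropagation like any other layer.
optimizer = torch.optim.SGD(student.parameters(), lr=0.01, momentum=0.9)

# One illustrative distillation-style step.
x = torch.randn(2, 3, 32, 32)
with torch.no_grad():
    t_out = teacher(x)
s_out = student(x)
loss = nn.functional.mse_loss(s_out, t_out)
loss.backward()
optimizer.step()

# student.attention.fc.weight.grad is now populated, confirming that the
# student's attention parameters receive gradients, while the frozen
# teacher parameters do not.
```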
These details are available in the paper's implementation details section.
Please let me know if there is any ambiguity.
After reading your paper and code, I found that the parameters of the CBAM module cannot be trained. Why is that?