In your paper, it is shown that the final quality score is obtained by combining the weighting branch and the scoring branch.
The only difference between the two branches seems to be the final activation function, i.e., ReLU versus Sigmoid.
Can you please share the intuition regarding the architecture of the two branches?
In the competition, we consider that different patches have different weights for the final assessment. So we add a Sigmoid function to learn the patch weights.
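A minimal sketch of what this two-branch design might look like, assuming a shared per-patch feature vector feeds both heads (module and layer names are illustrative, not from the paper): the scoring branch ends in a ReLU to produce a non-negative per-patch quality score, the weighting branch ends in a Sigmoid to produce a per-patch weight in (0, 1), and the final score is the weight-normalized average.

```python
import torch
import torch.nn as nn

class PatchWeightedScore(nn.Module):
    """Hypothetical sketch: the two branches share the same layout;
    only the final activation differs (ReLU for scores, Sigmoid for weights)."""

    def __init__(self, feat_dim=512, hidden=256):
        super().__init__()
        self.score_branch = nn.Sequential(
            nn.Linear(feat_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, 1), nn.ReLU())      # per-patch quality score >= 0
        self.weight_branch = nn.Sequential(
            nn.Linear(feat_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, 1), nn.Sigmoid())   # per-patch weight in (0, 1)

    def forward(self, patch_feats):               # (batch, n_patches, feat_dim)
        s = self.score_branch(patch_feats)        # (batch, n_patches, 1)
        w = self.weight_branch(patch_feats)       # (batch, n_patches, 1)
        # weighted average over patches; eps guards against all-near-zero weights
        return (w * s).sum(dim=1) / (w.sum(dim=1) + 1e-8)  # (batch, 1)

feats = torch.randn(2, 8, 512)  # 2 images, 8 patches each, assumed 512-dim features
model = PatchWeightedScore()
q = model(feats)
print(q.shape)  # torch.Size([2, 1])
```

Under this reading, the Sigmoid bounds each patch's contribution so no single patch can dominate the pooled score, while the ReLU leaves the score range open-ended.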