ndb796 / Deep-Learning-Paper-Review-and-Practice

Thorough deep learning paper reviews and code practice

Where's the dissim term on the loss? #1

Closed hihunjin closed 3 years ago

hihunjin commented 3 years ago

I'm looking into your code, and I found that the loss function is slightly different from the one in the paper. The paper includes a Dissim term, but your code doesn't have it. Did you remove it on purpose?

Note: I meant loss=cost here

ndb796 commented 3 years ago

Thank you for the good question.

I didn't use the Dissim term because my code implements the 1-channel attack version, so the output of the Dissim function is always 0. The author of the original paper mentions that the 1-channel attack forces the values of each channel to be the same.

In the source code, the get_ct() function duplicates the values of a single channel into the three channels (r, g, b). Since the 1-channel attack is more constrained, it might be weaker than the 3-channel attack, but its perturbation tends to look like a more natural shadow.
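A minimal sketch of why the Dissim term vanishes here (this is not the repository's exact code; the squared-difference form of `dissim` is an assumption for illustration, and `get_ct` below only mimics the channel-duplication behavior described above):

```python
import torch

def dissim(delta):
    # Assumed Dissim-style penalty: measures how much the R, G, B
    # channels of the perturbation differ from one another.
    # delta: (3, H, W) perturbation tensor.
    r, g, b = delta[0], delta[1], delta[2]
    return ((r - g) ** 2 + (r - b) ** 2 + (g - b) ** 2).mean()

def get_ct(delta_1ch):
    # Mimics the described get_ct() behavior: duplicate a
    # single-channel perturbation (1, H, W) into 3 identical
    # channels (3, H, W) for the 1-channel attack.
    return delta_1ch.repeat(3, 1, 1)

# Because all three channels are exact copies, every pairwise
# difference is zero, so the Dissim term contributes nothing:
delta = get_ct(torch.randn(1, 8, 8))
print(dissim(delta).item())  # 0.0
```

So dropping the term from the loss changes nothing for the 1-channel attack; it only matters when the three channels are optimized independently.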

Best regards, Dongbin Na

hihunjin commented 3 years ago

Thanks! I missed it, this is the 1-channel attack!