ashafahi / inceptionv3-transferLearn-poison

Attacking a dog-vs-fish classifier built with transfer learning on InceptionV3

About the derivation of the backward step #5

Open Tsingularity opened 4 years ago

Tsingularity commented 4 years ago

Hi, thanks a lot for releasing the code; this is a really interesting paper!

But I am pretty confused about how you derive the update rule for the backward step. I checked the reference given in the paper for the "forward-backward splitting algorithm" but still didn't get it. I understand proximal operator methods, but I don't see how the backward update rule you give follows from proximal gradient descent.
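
For reference, the generic forward-backward splitting iteration for minimizing a sum f(x) + g(x) alternates an explicit gradient step on the smooth part f with an implicit proximal step on g (standard notation, not taken from the repo):

```latex
% Forward (explicit) step: gradient descent on the smooth part f
\tilde{x}_k = x_{k-1} - \lambda \nabla f(x_{k-1})
% Backward (implicit) step: proximal operator of g
x_k = \operatorname{prox}_{\lambda g}\!\left(\tilde{x}_k\right)
    = \arg\min_z \tfrac{1}{2}\,\lVert z - \tilde{x}_k \rVert_2^2 + \lambda\, g(z)
```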

I also see that someone else has raised the same problem in the GitHub issues. Could you give a brief response when you have time? Thanks in advance!

Really interesting work and paper!

Tsingularity commented 4 years ago

Oops, sorry, my mistake. I think I see why now: your backward step is simply the closed-form minimizer of the proximal objective at that step.
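
For completeness: assuming the penalty is the paper's proximity term g(x) = (β/2)‖x − b‖² with base instance b (a factor of 2 may be absorbed into β depending on convention), the proximal objective is a strictly convex quadratic, so setting its gradient to zero gives the backward update in closed form:

```latex
% prox objective: h(z) = (1/2)\lVert z - \tilde{x}_k \rVert^2 + \lambda(\beta/2)\lVert z - b \rVert^2
% \nabla h(z) = (z - \tilde{x}_k) + \lambda\beta\,(z - b) = 0
x_k = \frac{\tilde{x}_k + \lambda\beta\, b}{1 + \lambda\beta}
```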

But that raises another simple (possibly silly again) question: since g(x) is fully differentiable, why optimize the original f(x) + g(x) with the forward-backward splitting algorithm instead of performing gradient descent on it directly?

Thanks!
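
For anyone reading along later, here is a minimal NumPy sketch of one forward-backward iteration under the scheme above. The names (`grad_feature_loss`, `base`, `lam`, `beta`) are my own placeholders, not the repo's API, and the toy quadratic stands in for the real feature-collision loss:

```python
import numpy as np

def forward_backward_step(x, base, grad_feature_loss, lam=0.01, beta=0.25):
    """One iteration of forward-backward splitting for
    min_x f(x) + (beta/2) * ||x - base||^2,
    where f is the (nonconvex) feature-collision loss.

    grad_feature_loss: callable returning the gradient of f at x
                       (placeholder; in practice this comes from autodiff).
    """
    # Forward (explicit) step: plain gradient descent on f.
    x_tilde = x - lam * grad_feature_loss(x)
    # Backward (implicit) step: closed-form prox of the quadratic penalty,
    # x = argmin_z 0.5*||z - x_tilde||^2 + lam*(beta/2)*||z - base||^2.
    return (x_tilde + lam * beta * base) / (1.0 + lam * beta)

# Toy usage with a quadratic stand-in for the feature loss.
target_feat = np.array([1.0, -2.0, 0.5])
grad = lambda x: 2.0 * (x - target_feat)   # gradient of ||x - target_feat||^2
x = np.zeros(3)
for _ in range(200):
    x = forward_backward_step(x, base=np.ones(3), grad_feature_loss=grad)
print(x)  # ends up between target_feat and base, weighted by beta
```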