eth-sri / probabilistic-forecasts-attacks

Apache License 2.0

Gradient Relationship between "aux_estimate.grad", "mean.grad" and "perturbation.grad"? #6

Closed HenryPengZou closed 3 years ago

HenryPengZou commented 3 years ago

https://github.com/eth-sri/probabilistic-forecasts-attacks/blob/df78ffd7375089bb7b38fbf55bb840b7699365eb/Finance/attack/attacks.py#L91-L116

Hi, one more thing.

I suppose the first `loss.backward()` here computes `mean.grad`, `aux_estimate.backward()` computes `aux_estimate.grad`, and the second `loss.backward()` computes `perturbation.grad`. Could you explain the relationship between these three gradients?

My guess is `perturbation.grad = mean.grad * aux_estimate.grad`. Is this correct?

rdang-nhu commented 3 years ago

Hi. This is related to issues #4 and #5. The loss function to differentiate is

(screenshot of the loss: the distance φ between the expected forecast and the target t, plus a norm penalty on the perturbation δ)
  1. `aux_estimate.backward()` computes the gradient mean (the expectation) -> perturbation (delta) using the score-function estimator and places it in `perturbation.grad`.
  2. The first `loss.backward()` computes the gradient loss (obj) -> mean (the expectation) and places it in `mean.grad`, which is then multiplied into `perturbation.grad` to apply the chain rule.

Together, 1. and 2. compute the gradient loss (obj) -> perturbation (delta) through the function phi that measures the distance to the target t.

  3. The second `loss.backward()` computes the gradient loss (obj) -> perturbation (delta), but only through the norm, and adds it to `perturbation.grad` (when `.backward()` is called several times, the gradients are accumulated by summation; see the snippet below).
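
To illustrate the accumulation behaviour mentioned in step 3, here is a generic PyTorch example (not taken from this repository):

```python
import torch

x = torch.tensor(1.0, requires_grad=True)

(2 * x).backward()   # d(2x)/dx = 2
print(x.grad)        # tensor(2.)

(3 * x).backward()   # d(3x)/dx = 3, summed into the existing gradient
print(x.grad)        # tensor(5.), not tensor(3.)
```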

This convoluted way of computing the gradient is needed because, with the score-function estimator, we cannot rely on autograd to compute step 1 automatically.
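
To make the three steps concrete, here is a minimal, self-contained sketch of the same pattern in PyTorch. The toy forecaster `sample_forecasts`, the squared-error `phi`, the `target`, and the weight `lam` are hypothetical stand-ins, and unlike the repository's code the distance term and the norm term are differentiated by two separate `.backward()` calls instead of two calls on the full loss; it only illustrates how the score-function surrogate and the manual chain rule fit together:

```python
import torch

# Toy stand-in for the probabilistic forecaster: given the (perturbed) input,
# draw sampled forecasts and return them with their log-probabilities.
def sample_forecasts(x, n=1000):
    mu = x.sum()                           # forecast distribution depends on the input
    dist = torch.distributions.Normal(mu, 1.0)
    samples = dist.sample((n,))            # non-differentiable draws (no grad to mu)
    log_probs = dist.log_prob(samples)     # differentiable w.r.t. mu, hence the input
    return samples, log_probs

x = torch.zeros(3)
target = torch.tensor(2.0)
lam = 0.1

perturbation = (0.01 * torch.randn(3)).requires_grad_()

samples, log_probs = sample_forecasts(x + perturbation)

# The empirical mean is disconnected from the sampling graph, so autograd alone
# cannot give d(mean)/d(perturbation); that is why the score-function estimator is needed.
mean = samples.mean().requires_grad_()

# Step 1: the score-function surrogate. Its gradient w.r.t. the perturbation is an
# unbiased estimate of d(mean)/d(perturbation) and lands in perturbation.grad.
aux_estimate = (samples.detach() * log_probs).mean()
aux_estimate.backward()

# Step 2: differentiate the distance term phi(mean, target) w.r.t. the mean,
# then apply the chain rule by hand.
phi = (mean - target) ** 2
phi.backward()
perturbation.grad *= mean.grad

# Step 3: the norm penalty is a regular autograd path; its gradient is
# accumulated (summed) into perturbation.grad.
(lam * perturbation.norm()).backward()

print(perturbation.grad)   # estimate of d(loss)/d(perturbation)
```

The key point is the same as in the answer above: autograd handles everything except the expectation, whose gradient has to be injected by hand through the score-function surrogate.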

HenryPengZou commented 3 years ago

Great explanation, very clear! At first, I was confused about how `aux_estimate` relates to `mean.grad`, but your latest answer resolved my doubts and questions. I appreciate the effort you put into editing this answer again and again to make it clearer. Thanks again! 🌷