I am very impressed by your papers, in particular the theorems showing that the constrained problems can be solved with efficient algorithms that run on GPUs.
In the theorem proofs, I notice that the original constrained problem includes constraints of the form x(i) = W(i)x(i-1) + b(i), which only capture the behavior of fully-connected/convolutional/... layers.
But an Add layer in a residual network exported to ONNX computes x(i) = x(i-1) + x(i-k). I fail to see how the theorem extends to residual networks, yet I did observe residual networks in your experiments.
So I wonder whether there is a theorem behind handling residual networks, and if so, whether it is just a customization of your existing theorem?
Thank you in advance for your clarification!
Hi @JacksonZyy, bound propagation for general computational graphs, including ResNets, is formulated in the auto_LiRPA paper: https://arxiv.org/abs/2002.12920
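In case a concrete example helps, here is a minimal sketch of bounding a toy residual block with auto_LiRPA, following the usage pattern in its README (BoundedModule, BoundedTensor, PerturbationLpNorm, compute_bounds). The ResBlock module, dimensions, and eps below are illustrative placeholders, not taken from the papers.

```python
import torch
import torch.nn as nn
from auto_LiRPA import BoundedModule, BoundedTensor
from auto_LiRPA.perturbations import PerturbationLpNorm

class ResBlock(nn.Module):
    """Toy residual block: the skip connection becomes an Add node,
    i.e. x(i) = x(i-1) + x(i-k), when the model is exported as a graph."""
    def __init__(self, dim=16):
        super().__init__()
        self.fc1 = nn.Linear(dim, dim)
        self.fc2 = nn.Linear(dim, dim)
        self.relu = nn.ReLU()

    def forward(self, x):
        return self.relu(x + self.fc2(self.relu(self.fc1(x))))

model = ResBlock()
x0 = torch.randn(1, 16)

# Wrap the model; auto_LiRPA traces it into a general computational graph,
# so the Add node is handled like any other operator.
bounded_model = BoundedModule(model, torch.zeros_like(x0))

# L-infinity ball of radius 0.01 around the input (illustrative value).
ptb = PerturbationLpNorm(norm=float("inf"), eps=0.01)
x = BoundedTensor(x0, ptb)

# Backward (CROWN-style) bound propagation over the graph.
lb, ub = bounded_model.compute_bounds(x=(x,), method="CROWN")
print(lb, ub)
```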