Verified-Intelligence / alpha-beta-CROWN

alpha-beta-CROWN: An Efficient, Scalable and GPU Accelerated Neural Network Verifier (winner of VNN-COMP 2021, 2022, 2023, and 2024)

Query of theorem of handling residual networks with ADD layer #43

Closed JacksonZyy closed 9 months ago

JacksonZyy commented 11 months ago

Dear,

I am very impressed by your papers, in particular the theorems showing how you solve the constrained problems with efficient algorithms that run on GPU. In the theorem proofs, I notice that the original constrained problem includes constraints x(i) = W(i)x(i-1) + b(i), which only capture the behavior of fully-connected/convolutional/... layers. But an Add layer in a residual network in an ONNX model computes something like x(i) = x(i-1) + x(i-k). I fail to see how the theorem extends to residual networks, yet I did observe residual networks in your experiments. So I wonder: is there a theorem behind handling residual networks? And is that theorem (if any) just a customization of your existing one? Thank you in advance for your clarification!
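For intuition on why the Add layer is the easy case: x(i) = x(i-1) + x(i-k) is linear in both inputs, so bounds compose directly. Below is a minimal sketch (my own illustration, not the alpha-beta-CROWN implementation) of interval bound propagation through a residual Add node: the output bounds are just the elementwise sums of the input bounds.

```python
def add_bounds(low_a, high_a, low_b, high_b):
    """Propagate interval bounds through x = a + b (elementwise).

    Since addition is linear and monotone in each input, the tightest
    interval for the sum is the sum of the intervals.
    """
    low = [la + lb for la, lb in zip(low_a, low_b)]
    high = [ha + hb for ha, hb in zip(high_a, high_b)]
    return low, high

# Example: skip-connection bounds [1, 2] plus block-output bounds [-1, 3]
low, high = add_bounds([1.0], [2.0], [-1.0], [3.0])
# low == [0.0], high == [5.0]
```

Linear relaxations (as in CROWN-style bounds) compose just as directly, since the Add node's Jacobian with respect to each input is the identity.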

shizhouxing commented 11 months ago

Hi @JacksonZyy , bound propagation for general computational graphs including ResNet is formulated in auto_LiRPA's paper: https://arxiv.org/abs/2002.12920
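To illustrate the general-graph idea from that paper: in backward bound propagation, the output is expressed as a linear function A·x + c of some intermediate node x, and the coefficients A are propagated backward through the graph. At an Add node, since add(a, b) = a + b has identity Jacobian with respect to each input, the same A flows to both parents; coefficients reaching a shared node along different paths (e.g. a skip connection) accumulate by summation. The sketch below is my own simplified illustration of this bookkeeping, not auto_LiRPA's actual code; the node names are hypothetical.

```python
def backprop_add(coeffs, node, parent_a, parent_b):
    """Push the backward coefficient matrix of an Add node to its parents.

    `coeffs` maps node name -> coefficient matrix (nested lists). The Add
    node's matrix is sent unchanged to both parents; if a parent already
    received coefficients along another path, the matrices are summed.
    """
    A = coeffs.pop(node)
    for parent in (parent_a, parent_b):
        if parent in coeffs:  # another path already reached this node
            coeffs[parent] = [[x + y for x, y in zip(r1, r2)]
                              for r1, r2 in zip(coeffs[parent], A)]
        else:
            coeffs[parent] = [row[:] for row in A]
    return coeffs

# Example: the Add node merges a residual block output with a skip
# connection that already carries coefficients from another path.
coeffs = {"add": [[1.0, 2.0]], "skip": [[0.5, 0.0]]}
backprop_add(coeffs, "add", "block_out", "skip")
# coeffs == {"skip": [[1.5, 2.0]], "block_out": [[1.0, 2.0]]}
```

This is the sense in which the fully-connected theorem extends: every node in the DAG, including Add, only needs a rule for mapping output coefficients to input coefficients.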