eth-sri / eran

ETH Robustness Analyzer for Deep Neural Networks

Why no back-substitution for ReLU layer? #98

Closed JacksonZyy closed 2 years ago

JacksonZyy commented 2 years ago

Hi,

I recently read a TACAS'21 paper that included a re-implementation of DeepPoly in Python. I noticed in their code that they also perform the back-substitution process for unstable ReLU neurons. Therefore, I wonder why there is no back-substitution process for the ReLU layer in DeepPoly? I would really appreciate a reply at your earliest convenience.

Thanks and regards, Yuyi

mnmueller commented 2 years ago

Hello @JacksonZyy,

Thank you for your interest in ERAN. There is in fact also a back-substitution through unstable ReLUs in our DeepPoly implementation.

Cheers, Mark

JacksonZyy commented 2 years ago

Hi,

Thanks for the reply. I think you might have misunderstood my question. I meant that the back-substitution process (through intermediate ReLU and affine layers, back to the input layer) is designed to compute more precise concrete lower and upper bounds for affine neurons (according to my understanding). Why don't you also use the back-substitution process to compute concrete bounds for the ReLU neurons? Thank you.
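
For context, here is a minimal, self-contained NumPy sketch of the mechanism being discussed (not ERAN's actual code; all names such as `backsub_upper_through_relu`, `W1`, `l1` are illustrative). It shows how back-substituting a linear upper bound through the DeepPoly ReLU relaxation and an affine layer, and only then concretizing over the input box, yields a tighter concrete bound than plain interval propagation:

```python
import numpy as np

def concretize_upper(coeffs, const, l, u):
    """Concrete upper bound of coeffs @ x + const for x in the box [l, u]."""
    return const + np.sum(np.where(coeffs >= 0, coeffs * u, coeffs * l))

def concretize_lower(coeffs, const, l, u):
    """Concrete lower bound of coeffs @ x + const for x in the box [l, u]."""
    return const + np.sum(np.where(coeffs >= 0, coeffs * l, coeffs * u))

def backsub_upper_through_relu(coeffs, const, l, u):
    """Substitute the DeepPoly ReLU relaxation into an upper-bound expression.
    Stable neurons: y = 0 (u <= 0) or y = x (l >= 0).
    Unstable neurons: y <= u/(u-l) * (x - l) and y >= 0 (one valid lower choice)."""
    denom = np.maximum(u - l, 1e-12)
    lam_up = np.where(u <= 0, 0.0, np.where(l >= 0, 1.0, u / denom))
    mu_up = np.where((l < 0) & (u > 0), -u * l / denom, 0.0)
    lam_low = np.where(u <= 0, 0.0, np.where(l >= 0, 1.0, 0.0))
    # Positive coefficients pick the upper relaxation, negative ones the lower.
    new_coeffs = np.where(coeffs >= 0, coeffs * lam_up, coeffs * lam_low)
    new_const = const + np.sum(np.where(coeffs >= 0, coeffs * mu_up, 0.0))
    return new_coeffs, new_const

def backsub_through_affine(coeffs, const, W, b):
    """Substitute y = W @ x + b into coeffs @ y + const."""
    return coeffs @ W, const + coeffs @ b

# Tiny example: 2-D input box, one affine layer, one ReLU, one output neuron.
x_l, x_u = np.array([-1.0, -1.0]), np.array([1.0, 1.0])
W1, b1 = np.array([[1.0, 1.0], [1.0, -1.0]]), np.array([0.0, 0.0])
w2, c2 = np.array([1.0, 1.0]), 0.0      # output = w2 @ relu(W1 @ x + b1) + c2

# Intermediate (pre-ReLU) concrete bounds; for the first layer, back-substitution
# to the input box coincides with direct concretization of each row.
l1 = np.array([concretize_lower(W1[i], b1[i], x_l, x_u) for i in range(2)])
u1 = np.array([concretize_upper(W1[i], b1[i], x_l, x_u) for i in range(2)])

coeffs, const = backsub_upper_through_relu(w2, c2, l1, u1)     # through the ReLU
coeffs, const = backsub_through_affine(coeffs, const, W1, b1)  # through the affine layer
print("back-substituted upper bound:", concretize_upper(coeffs, const, x_l, x_u))  # 3.0
print("plain interval upper bound:  ", np.sum(np.maximum(u1, 0.0)))                # 4.0
```

ERAN's actual implementation generalizes this recursion over all preceding layers; the sketch only illustrates the mechanism being asked about.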

mnmueller commented 2 years ago

Hi @JacksonZyy,

We actually also compute the intermediate bounds (as we call the concrete bounds of the inputs to activation neurons) via back-substitution.

Cheers, Mark