JacksonZyy closed this issue 2 years ago
Hello @JacksonZyy,
Thank you for your interest in ERAN. There is in fact also a back substitution through unstable ReLUs in our DeepPoly implementation.
Cheers, Mark
Hi,
Thanks for the reply, but I think you might have misunderstood my question. I meant that the back-substitution process (through intermediate ReLU and affine layers, back to the input layer) is designed to compute more precise concrete lower and upper bounds for affine neurons (according to my understanding). But why don't you use the back-substitution process to compute concrete bounds for ReLU neurons? Thank you.
Hi @JacksonZyy,
We actually also compute the intermediate bounds (as we call the concrete bounds at the inputs of activation neurons) via back-substitution.
Cheers, Mark
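For readers following along, here is a minimal sketch of the kind of back-substitution being discussed, for a two-layer network `z2 = W2 relu(W1 x + b1) + b2` over an input box. All names and the toy setup are illustrative, not ERAN's actual implementation; the ReLU relaxation follows the DeepPoly scheme (upper line through the endpoints of an unstable neuron, lower line with slope 0 or 1):

```python
import numpy as np

def interval_affine(W, b, l, u):
    """Concrete bounds of W x + b over the box [l, u] (interval arithmetic)."""
    Wp, Wn = np.maximum(W, 0.0), np.minimum(W, 0.0)
    return Wp @ l + Wn @ u + b, Wp @ u + Wn @ l + b

def relu_relaxation(l, u):
    """DeepPoly-style per-neuron linear bounds:
    lam_l * z <= relu(z) <= lam_u * z + mu_u on [l, u]."""
    lam_l = np.where(u <= 0, 0.0, np.where(l >= 0, 1.0, (u > -l).astype(float)))
    lam_u = np.where(u <= 0, 0.0, np.where(l >= 0, 1.0, u / (u - l)))
    mu_u  = np.where((l < 0) & (u > 0), -u * l / (u - l), 0.0)
    return lam_l, lam_u, mu_u

def backsub_bounds(W1, b1, W2, b2, l_in, u_in):
    """Concrete bounds of z2 = W2 relu(W1 x + b1) + b2 by substituting the
    ReLU relaxation and then the first affine layer back to the input box."""
    # Intermediate bounds of z1 = W1 x + b1 (exact for a single affine layer).
    l1, u1 = interval_affine(W1, b1, l_in, u_in)
    lam_l, lam_u, mu_u = relu_relaxation(l1, u1)
    W2p, W2n = np.maximum(W2, 0.0), np.minimum(W2, 0.0)
    # Upper bound: positive coefficients pick the ReLU upper relaxation,
    # negative coefficients pick the lower one (and symmetrically below).
    A_u, c_u = W2p * lam_u + W2n * lam_l, W2p @ mu_u + b2
    A_l, c_l = W2p * lam_l + W2n * lam_u, W2n @ mu_u + b2
    # Substitute z1 = W1 x + b1 and evaluate the linear forms over the input box.
    lo, _ = interval_affine(A_l @ W1, A_l @ b1 + c_l, l_in, u_in)
    _, hi = interval_affine(A_u @ W1, A_u @ b1 + c_u, l_in, u_in)
    return lo, hi

# Toy example: z2 = relu(x1 + x2) + relu(x1 - x2) over [-1, 1]^2.
W1 = np.array([[1.0, 1.0], [1.0, -1.0]]); b1 = np.zeros(2)
W2 = np.array([[1.0, 1.0]]); b2 = np.zeros(1)
lo, hi = backsub_bounds(W1, b1, W2, b2, np.array([-1.0, -1.0]), np.array([1.0, 1.0]))
```

In this toy case the true range of `z2` is `[0, 2]`, and the sketch returns the sound (looser) bounds `[0, 3]`; the point of back-substituting through deeper stacks of layers is exactly to tighten such bounds compared to plain interval propagation.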
Hi,
I recently read a TACAS'21 paper that included a re-implementation of DeepPoly in Python. I noticed in their code that they also perform the back-substitution process for unstable ReLU neurons. I therefore wonder why there is no back-substitution step for the ReLU layers in DeepPoly? I would really appreciate a reply at your earliest convenience.
Thanks and regards, Yuyi