gladzhang / ART

PyTorch code for our ICLR 2023 paper "Accurate Image Restoration with Attention Retractable Transformer".
Apache License 2.0

Thanks for your brilliant work! Have you tried running 'DAB' and 'SAB' in parallel instead of putting them in sequence? #11

Open lyf1212 opened 1 year ago

lyf1212 commented 1 year ago

It was very sharp of you to discover the potential of the sparse attention mechanism and propose the DAB in your paper~ I'd like to know: have you tried putting the DAB and SAB in parallel when constructing the model architecture, instead of putting them in sequence?
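
For concreteness, here is a minimal PyTorch sketch of the two arrangements. The `DAB` and `SAB` classes below are placeholders for the repo's actual dense and sparse attention blocks, and the concatenate-then-project fusion in the parallel variant is an assumption for illustration, not something from the paper:

```python
import torch
import torch.nn as nn

# Placeholder stand-ins for ART's dense (DAB) and sparse (SAB)
# attention blocks; in the repo these are full transformer blocks.
# Here they only illustrate how the blocks could be wired together.
class DAB(nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.body = nn.Linear(dim, dim)

    def forward(self, x):
        return self.body(x)

class SAB(nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.body = nn.Linear(dim, dim)

    def forward(self, x):
        return self.body(x)

class SequentialDABSAB(nn.Module):
    """DAB followed by SAB, as in the paper's alternating design."""
    def __init__(self, dim):
        super().__init__()
        self.dab = DAB(dim)
        self.sab = SAB(dim)

    def forward(self, x):
        return self.sab(self.dab(x))

class ParallelDABSAB(nn.Module):
    """Hypothetical parallel variant: both branches see the same input,
    and their outputs are fused by a learned projection of the
    concatenation (simple summation would be an alternative)."""
    def __init__(self, dim):
        super().__init__()
        self.dab = DAB(dim)
        self.sab = SAB(dim)
        self.fuse = nn.Linear(2 * dim, dim)

    def forward(self, x):
        y = torch.cat([self.dab(x), self.sab(x)], dim=-1)
        return self.fuse(y)
```

One trade-off this makes visible: the sequential design lets sparse attention operate on features already mixed by dense attention, while the parallel design keeps the two receptive patterns independent and pushes the interaction into the fusion step.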

gladzhang commented 1 year ago

I have not tried this. It could be worthwhile to design a parallel structure that uses DAB and SAB together, once you find a good motivation for it.