ZhangYuanhan-AI / NOAH

[TPAMI] Searching prompt modules for parameter-efficient transfer learning.
MIT License

Question about Adapter #14

Closed RaptorMai closed 1 year ago

RaptorMai commented 1 year ago

Thank you so much for sharing the code. In the original Adapter paper, one adapter layer is placed after the attention module and another after the feed-forward (FF) layer. In your paper, the adapter seems to be placed only after the FF layer. Could you elaborate on the rationale behind this change? Thank you in advance; I look forward to hearing back from you.

ZhangYuanhan-AI commented 1 year ago

Hi Mai,

We follow the setting described in the VPT paper (https://arxiv.org/pdf/2203.12119.pdf, page 15).

As it states: "[63, 64] exhaustively searched all possible configurations and found that only inserting adapters after the FFN 'Add & LayerNorm' sub-layer works the best." We therefore use the same setup in our implementation.
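For reference, here is a minimal PyTorch sketch of that configuration: a bottleneck adapter inserted only after the FFN sub-layer, with no adapter on the attention path. The class and parameter names (`Adapter`, `BlockWithAdapter`, `bottleneck`) are illustrative rather than NOAH's actual code, and the block is written post-norm to match the quoted wording, so the exact insertion point in the repo's (pre-norm ViT) implementation may differ slightly.

```python
import torch
import torch.nn as nn

class Adapter(nn.Module):
    """Bottleneck adapter: down-project, non-linearity, up-project, residual."""
    def __init__(self, dim, bottleneck=64):
        super().__init__()
        self.down = nn.Linear(dim, bottleneck)
        self.act = nn.GELU()
        self.up = nn.Linear(bottleneck, dim)

    def forward(self, x):
        return x + self.up(self.act(self.down(x)))

class BlockWithAdapter(nn.Module):
    """Post-norm transformer block with a single adapter after the FFN sub-layer."""
    def __init__(self, dim, num_heads, mlp_ratio=4, bottleneck=64):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.norm1 = nn.LayerNorm(dim)
        self.mlp = nn.Sequential(
            nn.Linear(dim, dim * mlp_ratio), nn.GELU(), nn.Linear(dim * mlp_ratio, dim)
        )
        self.norm2 = nn.LayerNorm(dim)
        self.adapter = Adapter(dim, bottleneck)  # the only adapter in the block

    def forward(self, x):
        # attention sub-layer with its own Add & LayerNorm -- no adapter here
        a, _ = self.attn(x, x, x)
        x = self.norm1(x + a)
        # FFN sub-layer with Add & LayerNorm, then the single adapter
        x = self.norm2(x + self.mlp(x))
        x = self.adapter(x)  # adapter inserted only after the FFN "Add & LayerNorm"
        return x

# quick shape check
block = BlockWithAdapter(dim=768, num_heads=12)
out = block(torch.randn(2, 197, 768))  # (batch, tokens, dim)
print(out.shape)
```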