Closed: simonlevine closed this issue 1 year ago.

Hi, I have a few questions:
Thank you very much.
Hi Simon,
Sorry, I didn't get notified by GitHub about your comment! I'll answer your questions below:
I hope this helps address your questions! Please feel free to comment again if anything remains unclear.
Best, Zaixiang
Hi @zhengzx-nlp, thanks for your response. That all makes sense. And yes, regarding question 4: there is a section on page 15 of the manuscript that reads "...the structural adapter composes a multihead attention (MULTIHEAD ATTN) that queries structure information from the structure encoder...". This would seem to imply that the queries (Q) come from the structure encoder, rather than from the pLM.
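To make the ambiguity concrete, here is a minimal, hypothetical sketch (plain PyTorch, not code from this repo; the class name, argument names, and dimensions are mine) of a structural adapter built on multihead cross-attention, showing the two possible wirings: queries taken from the structure encoder output versus queries taken from the pLM hidden states.

```python
# Hypothetical sketch only -- not the paper's implementation. It illustrates the
# two readings of "queries structure information from the structure encoder".
import torch
import torch.nn as nn


class StructuralAdapterSketch(nn.Module):
    def __init__(self, d_model: int = 512, n_heads: int = 8,
                 query_from_structure: bool = False):
        super().__init__()
        # batch_first=True => tensors are (batch, seq_len, d_model)
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.query_from_structure = query_from_structure

    def forward(self, plm_hidden: torch.Tensor, struct_hidden: torch.Tensor):
        if self.query_from_structure:
            # Reading A: Q comes from the structure encoder,
            # K/V come from the pLM hidden states.
            out, _ = self.attn(query=struct_hidden, key=plm_hidden, value=plm_hidden)
        else:
            # Reading B: Q comes from the pLM hidden states and attends to
            # structure information held in K/V from the structure encoder.
            out, _ = self.attn(query=plm_hidden, key=struct_hidden, value=struct_hidden)
        return out


# Toy shapes: batch of 2 sequences of length 10, model dim 512.
plm_hidden = torch.randn(2, 10, 512)
struct_hidden = torch.randn(2, 10, 512)
adapter = StructuralAdapterSketch(query_from_structure=False)
print(adapter(plm_hidden, struct_hidden).shape)  # torch.Size([2, 10, 512])
```

The quoted sentence reads to me like the first wiring (query_from_structure=True), which is why I wanted to confirm which one the implementation uses.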