Closed yuchen3890 closed 6 months ago
Hi, thanks for pointing this out. The max length of 348 for the MultiRC dataset follows previous work such as ATTEMPT (https://aclanthology.org/2022.emnlp-main.446.pdf), and the reason behind this specific number isn't clear. I fully agree that the number of trainable parameters will increase in this case; thank you so much for bringing this to our attention. This can be a potential limitation of our study, and we will discuss it in the revised version.
Ok I see. Thank you for the reply!
Hello, thank you for the nice work! I would like to know why the max length for the SuperGLUE MultiRC dataset is 348 rather than 256. Are there any findings behind this setting? Also, will the number of trainable parameters increase when using max length = 348 while the number of appended prompts stays at 60 and the LoRA rank r stays at 30? Thank you very much!
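For readers following along, here is a rough, method-agnostic way to sanity-check the parameter counts being discussed. This is a minimal sketch, not the paper's actual code: the hidden size (768) is a hypothetical choice, and it assumes a standard soft-prompt plus LoRA setup in which neither count depends on the max sequence length. If the method in question ties the prompt or adapter size to the input length, the counts would scale with max length, which is the concern raised above.

```python
import torch
import torch.nn as nn

# Hypothetical dimensions for illustration; the paper's actual model may differ.
hidden = 768        # transformer hidden size (e.g., a T5-base-like model)
num_prompts = 60    # appended soft-prompt tokens, as in the thread
r = 30              # LoRA rank, as in the thread

# Soft prompt: one trainable embedding per appended token.
soft_prompt = nn.Parameter(torch.zeros(num_prompts, hidden))

# LoRA adapter for a single hidden x hidden weight W: W + B @ A,
# where only the low-rank factors A and B are trainable.
lora_A = nn.Parameter(torch.zeros(r, hidden))
lora_B = nn.Parameter(torch.zeros(hidden, r))

def count(params):
    """Total number of trainable scalars across the given parameters."""
    return sum(p.numel() for p in params)

prompt_params = count([soft_prompt])   # 60 * 768 = 46080
lora_params = count([lora_A, lora_B])  # (30 * 768) + (768 * 30) = 46080

print(prompt_params, lora_params)  # → 46080 46080
```

Note that under this standard formulation, changing the max length from 256 to 348 leaves both counts unchanged, since no parameter shape involves the sequence length.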