xlang-ai / UnifiedSKG

[EMNLP 2022] Unifying and multi-tasking structured knowledge grounding with language models
https://arxiv.org/abs/2201.05966
Apache License 2.0

About prefix tuning #15

Closed · drxmy closed 2 years ago

drxmy commented 2 years ago

When doing multi-task prefix-tuning, my understanding is that T5 is not fine-tuned on any SKG task (or not fine-tuned at all?). Would it be better to fine-tune it first and then apply prefix-tuning on top? Just curious whether you have done any experiments on this.

Really wonderful work!

Timothyxxx commented 2 years ago

Hi,

Your understanding is right, and that's indeed an interesting setting! We planned to run such experiments and add them to Table 4 of the paper, but tuning a foundation model like T5 doesn't fit our blueprint, which assumes the super-large model shouldn't be changed, so we didn't do it in the end. (And of course we may add it if reviewers require us to do so =)

Hope this information is helpful! Thanks
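
For readers who want to see the frozen-backbone setup concretely, here is a minimal sketch of prefix-tuning a frozen T5 using the HuggingFace `peft` library (this is not the UnifiedSKG implementation; the model name and prefix length are illustrative). Swapping `"t5-base"` for a checkpoint already fine-tuned on SKG tasks would give the two-stage "fine-tune first, then prefix-tune" variant asked about above.

```python
from transformers import AutoModelForSeq2SeqLM
from peft import PrefixTuningConfig, TaskType, get_peft_model

# Load the backbone to be kept frozen. To try the two-stage variant,
# load a checkpoint already fine-tuned on SKG tasks here instead.
base = AutoModelForSeq2SeqLM.from_pretrained("t5-base")

# Only the prefix (virtual-token) parameters are trainable; the T5
# weights themselves stay frozen, matching the fixed-foundation-model
# blueprint described in the reply above.
peft_config = PrefixTuningConfig(
    task_type=TaskType.SEQ_2_SEQ_LM,
    num_virtual_tokens=10,  # prefix length; illustrative, not the paper's value
)
model = get_peft_model(base, peft_config)
model.print_trainable_parameters()  # shows a tiny trainable fraction
```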

drxmy commented 2 years ago

Thank you for the clarification.