AkihikoWatanabe / paper_notes

Paper notes, added occasionally
https://AkihikoWatanabe.github.io/paper_notes

Focused Prefix Tuning for Controllable Text Generation, ACL'23 #834

Open · AkihikoWatanabe opened this issue 1 year ago

AkihikoWatanabe commented 1 year ago

https://virtual2023.aclweb.org/paper_P1495.html

AkihikoWatanabe commented 1 year ago

In a controllable text generation dataset, there exist unannotated attributes that could provide irrelevant learning signals to models that use it for training and thus degrade their performance. We propose focused prefix tuning (FPT) to mitigate the problem and to enable the control to focus on the desired attribute. Experimental results show that FPT can achieve better control accuracy and text fluency than baseline models in single-attribute control tasks. In multi-attribute control tasks, FPT achieves comparable control accuracy with the state-of-the-art approach while keeping the flexibility to control new attributes without retraining existing models.
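
As background for the mechanism FPT builds on, here is a minimal soft-prompt sketch in PyTorch. This is not the authors' code, and it simplifies real prefix tuning, which prepends trainable vectors to every attention layer's keys and values rather than to the input embeddings; the prefix length, learning rate, and training sample below are placeholder assumptions. The base LM stays frozen and only a small per-attribute prefix is trained.

```python
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
for p in model.parameters():  # freeze the base LM; only the prefix is trained
    p.requires_grad = False

PREFIX_LEN = 10  # placeholder prefix length
emb_dim = model.config.n_embd
# One trainable prefix per controlled attribute (e.g., positive sentiment).
prefix = torch.nn.Parameter(torch.randn(PREFIX_LEN, emb_dim) * 0.02)
optimizer = torch.optim.AdamW([prefix], lr=5e-4)

def attribute_lm_loss(text: str) -> torch.Tensor:
    ids = tokenizer(text, return_tensors="pt").input_ids       # (1, T)
    tok_emb = model.transformer.wte(ids)                       # (1, T, D)
    inputs = torch.cat([prefix.unsqueeze(0), tok_emb], dim=1)  # prepend prefix
    # Mask the prefix positions out of the LM loss with label -100.
    labels = torch.cat(
        [torch.full((1, PREFIX_LEN), -100, dtype=torch.long), ids], dim=1
    )
    return model(inputs_embeds=inputs, labels=labels).loss

loss = attribute_lm_loss("this movie was absolutely wonderful")  # toy sample
loss.backward()   # gradients reach only the prefix
optimizer.step()
```

Because the base LM never changes, a prefix for a new attribute can be trained later without retraining existing ones, which is the flexibility the abstract claims for multi-attribute control. FPT itself goes further by focusing the control signal on the annotated attribute rather than on unannotated ones; see the linked paper page for the exact formulation.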
