ZhengxiangShi / DePT

[ICLR 2024] This is the repository for the paper titled "DePT: Decomposed Prompt Tuning for Parameter-Efficient Fine-tuning"
http://arxiv.org/abs/2309.05173
MIT License

Source Tasks for Few-Shot Experiments #12

Open khalilbalaree opened 3 months ago

khalilbalaree commented 3 months ago

Thanks for the insightful paper! I am currently reproducing the few-shot experiments reported in the paper and need a point of clarification about pretraining prompts on source tasks. Could you please specify which source task the prompts and low-rank pairs were pretrained on for the CoLA target task? Also, what learning rate and how many training steps should I use? Thanks for your help.
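For readers unfamiliar with the terminology in this question: in DePT, the trainable parameters are a shortened soft prompt plus a low-rank matrix pair that updates the frozen input word embeddings. Below is a minimal NumPy sketch of that decomposition; the dimensions (`d`, `m`, `s`, `r`) are illustrative placeholders, not the paper's exact hyperparameters.

```python
import numpy as np

rng = np.random.default_rng(0)

d = 768   # embedding dimension (placeholder, e.g. T5-base)
m = 40    # shortened soft-prompt length (DePT uses fewer prompt tokens than vanilla prompt tuning)
s = 256   # maximum input sequence length (placeholder)
r = 45    # rank of the low-rank pair (placeholder)

# Trainable pieces: a short soft prompt and a low-rank pair (A, B).
soft_prompt = rng.normal(scale=0.02, size=(m, d))
A = rng.normal(scale=0.02, size=(s, r))
B = np.zeros((r, d))  # B initialized to zero so the update starts as a no-op

def dept_input(frozen_word_embeds: np.ndarray) -> np.ndarray:
    """Add the low-rank update to the frozen embeddings, then prepend the soft prompt."""
    n = frozen_word_embeds.shape[0]
    updated = frozen_word_embeds + A[:n] @ B
    return np.concatenate([soft_prompt, updated], axis=0)

x = rng.normal(size=(s, d))   # stand-in for frozen word embeddings of one input
out = dept_input(x)
print(out.shape)              # (m + s, d) = (296, 768)
```

These are exactly the two parameter groups the question asks about pretraining on a source task before transferring to CoLA.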