THUDM / P-tuning-v2

An optimized deep prompt tuning strategy comparable to fine-tuning across scales and tasks

Can you provide the GPT2 example? #45

Closed PoodleWang closed 1 year ago

PoodleWang commented 1 year ago

Would you be able to provide a GPT-2 example for P-tuning v2?

PoodleWang commented 1 year ago

@Xiao9905

Xiao9905 commented 1 year ago

@PoodleWang Hi,

We do not implement P-tuning v2 for GPT-style models in this work. However, you can refer to Prefix-Tuning's implementation, which targets GPT models on generation tasks.
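
For readers looking for a starting point, below is a minimal, untested sketch of the prefix-tuning idea (trainable deep prompts injected as `past_key_values`) applied to a HuggingFace GPT-2 model. It is not from this repository or from the official Prefix-Tuning code: names such as `PrefixEncoder` and `prefix_len` are illustrative, and the base model stays frozen while only the prefix parameters would be trained.

```python
# Sketch of prefix tuning for GPT-2 with HuggingFace transformers.
# Assumes a transformers version that still accepts past_key_values
# as a per-layer tuple of (key, value) tensors.
import torch
import torch.nn as nn
from transformers import GPT2LMHeadModel, GPT2Tokenizer


class PrefixEncoder(nn.Module):
    """Maps trainable prefix positions to per-layer key/value pairs."""

    def __init__(self, config, prefix_len=20):
        super().__init__()
        self.prefix_len = prefix_len
        self.n_layer = config.n_layer
        self.n_head = config.n_head
        self.head_dim = config.n_embd // config.n_head
        # One embedding row per prefix position, projected to two
        # (key and value) tensors for every transformer layer.
        self.embedding = nn.Embedding(prefix_len, config.n_embd)
        self.mlp = nn.Sequential(
            nn.Linear(config.n_embd, config.n_embd),
            nn.Tanh(),
            nn.Linear(config.n_embd, self.n_layer * 2 * config.n_embd),
        )

    def forward(self, batch_size, device):
        idx = torch.arange(self.prefix_len, device=device)
        idx = idx.unsqueeze(0).expand(batch_size, -1)            # (B, P)
        kv = self.mlp(self.embedding(idx))                       # (B, P, L*2*E)
        kv = kv.view(batch_size, self.prefix_len,
                     self.n_layer * 2, self.n_head, self.head_dim)
        kv = kv.permute(2, 0, 3, 1, 4)                           # (L*2, B, H, P, D)
        # Split into the per-layer (key, value) tuple GPT-2 expects.
        return tuple((kv[2 * i], kv[2 * i + 1]) for i in range(self.n_layer))


# Usage: freeze GPT-2 and train only the prefix parameters.
tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
for p in model.parameters():
    p.requires_grad = False

prefix = PrefixEncoder(model.config, prefix_len=20)

inputs = tokenizer("P-tuning v2 is", return_tensors="pt")
batch_size = inputs["input_ids"].size(0)
past_key_values = prefix(batch_size, inputs["input_ids"].device)

# The attention mask must also cover the prefix positions.
prefix_mask = torch.ones(batch_size, prefix.prefix_len)
attention_mask = torch.cat([prefix_mask, inputs["attention_mask"]], dim=1)

outputs = model(
    input_ids=inputs["input_ids"],
    attention_mask=attention_mask,
    past_key_values=past_key_values,
    labels=inputs["input_ids"],
)
print(outputs.loss)
```

The `outputs.loss` here would be backpropagated through the prefix encoder only; the official Prefix-Tuning repository remains the reference implementation for generation tasks.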