THUDM / P-tuning

A novel method to tune language models. Code and datasets for the paper "GPT Understands, Too".
MIT License

original, lama, or shared? #31

Closed. One-punch24 closed this issue 2 years ago.

One-punch24 commented 2 years ago

Can I ask how to choose the vocabulary setting, given that it has three options (original, lama, shared)? What exactly do they mean, and which one should be used to reproduce the LAMA results reported in the paper? Thanks a lot, and I look forward to your answer.
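
For reference, here is a rough sketch of how I understand this option is exposed on the command line; the flag name `--vocab_strategy` and the default value are my assumptions from skimming the code, so please correct me if they differ:

```python
# Hypothetical sketch of the vocabulary-strategy option discussed above.
# The flag name and default are assumptions, not confirmed from the repo;
# only the three choices (original, shared, lama) come from the question.
import argparse

parser = argparse.ArgumentParser()
parser.add_argument(
    "--vocab_strategy",
    type=str,
    default="shared",
    choices=["original", "shared", "lama"],
    help="Which candidate vocabulary to use when evaluating on LAMA.",
)
args = parser.parse_args()
print(args.vocab_strategy)
```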