THUDM / P-tuning

A novel method for tuning language models. Code and datasets for the paper "GPT Understands, Too".
MIT License

Question about the prompt encoder of PT-Fewshot #25

Closed: ikm565 closed this issue 3 years ago

Xiao9905 commented 3 years ago
  1. The prompt encoder is actually used in the function generate_default_inputs (see the sketch below): https://github.com/THUDM/P-tuning/blob/413f26642b1f29bed03514eff4700e89346d668a/PT-Fewshot/pet/wrapper.py#L578-L598

  2. Thanks for the reminder. This problem has been reported before, and in fact we found that it does not affect the results. But since it has confused many readers, we have fixed it according to your advice.
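
For readers unfamiliar with the mechanism, here is a minimal, hypothetical sketch of the substitution step that a function like generate_default_inputs performs: a prompt encoder (an LSTM with an MLP head over learnable embeddings, as described in the paper) produces continuous prompt embeddings, which are written into the input-embedding tensor at the pseudo-token positions. All names here (`PromptEncoder`, `build_inputs_embeds`, `pseudo_token_id`) are illustrative assumptions, not the repo's actual API; see the linked lines in `wrapper.py` for the real implementation.

```python
# Hypothetical sketch of the P-tuning substitution step; not the repo's code.
import torch
import torch.nn as nn

class PromptEncoder(nn.Module):
    """LSTM + MLP head over learnable prompt embeddings (as in the paper)."""
    def __init__(self, prompt_length: int, hidden_size: int):
        super().__init__()
        self.embedding = nn.Embedding(prompt_length, hidden_size)
        # Bidirectional LSTM with hidden_size // 2 per direction, so the
        # concatenated output matches the model's hidden size.
        self.lstm = nn.LSTM(hidden_size, hidden_size // 2, num_layers=2,
                            bidirectional=True, batch_first=True)
        self.mlp = nn.Sequential(nn.Linear(hidden_size, hidden_size),
                                 nn.ReLU(),
                                 nn.Linear(hidden_size, hidden_size))

    def forward(self) -> torch.Tensor:
        ids = torch.arange(self.embedding.num_embeddings,
                           device=self.embedding.weight.device)
        out, _ = self.lstm(self.embedding(ids).unsqueeze(0))
        return self.mlp(out).squeeze(0)  # (prompt_length, hidden_size)

def build_inputs_embeds(input_ids, word_embeddings, prompt_encoder,
                        pseudo_token_id):
    """Replace the embeddings at pseudo-token positions with the
    prompt encoder's continuous outputs."""
    embeds = word_embeddings(input_ids).clone()  # (batch, seq, hidden)
    prompts = prompt_encoder()                   # (prompt_len, hidden)
    for b in range(input_ids.size(0)):
        # Positions where the template placed pseudo (prompt) tokens;
        # assumes at most prompt_len of them per example.
        positions = (input_ids[b] == pseudo_token_id).nonzero(as_tuple=True)[0]
        embeds[b, positions] = prompts[: positions.size(0)]
    return embeds  # pass to the model via its inputs_embeds argument
```

The actual generate_default_inputs naturally handles additional details (batching and model-specific inputs) beyond this illustration; the sketch only shows the core idea of injecting the prompt encoder's outputs at the pseudo-token positions.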

ikm565 commented 3 years ago

Thank you very much!