openai / gpt-2

Code for the paper "Language Models are Unsupervised Multitask Learners"
https://openai.com/blog/better-language-models/

Does the pre-training data also use this prompt structure related to downstream tasks? #305

Open Aurora-slz opened 2 years ago

Aurora-slz commented 2 years ago

I read the GPT-2 paper, but I am not sure whether the pre-training data from WebText includes task format information. For example, we know the data format is `english sentence = french sentence` in the translation task. So at pre-training time, will similar prompts be added to the training data?
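
For reference, here is a minimal sketch of the `english sentence = french sentence` conditioning format as I understand it from the paper's translation experiments. The sentence pairs and the helper function below are my own illustration, not code from this repo:

```python
# Sketch of the inference-time conditioning format the paper describes
# for zero-shot translation: a few "english = french" example pairs,
# followed by a final "english =" line for the model to complete.
# The example pairs here are hypothetical, chosen only to show the shape.

example_pairs = [
    ("Good morning.", "Bonjour."),
    ("Thank you very much.", "Merci beaucoup."),
]

def build_translation_prompt(pairs, source_sentence):
    """Format few-shot pairs as 'english = french' lines, then end
    with an incomplete 'english =' line to prompt the translation."""
    lines = [f"{en} = {fr}" for en, fr in pairs]
    lines.append(f"{source_sentence} =")
    return "\n".join(lines)

prompt = build_translation_prompt(example_pairs, "Where is the station?")
print(prompt)
# Good morning. = Bonjour.
# Thank you very much. = Merci beaucoup.
# Where is the station? =
```

My question is whether anything with this shape is deliberately inserted into WebText, or whether the model only ever sees such a format at inference time.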

Thanks!

joan126 commented 1 year ago

Also interested in this.