graykode / gpt-2-Pytorch

Simple Text-Generator with OpenAI gpt-2 Pytorch Implementation
MIT License

Help Increasing the amount of training/fine-tuning text to about 10k words #20

Open sleekmike opened 4 years ago

sleekmike commented 4 years ago

Hello, I am trying to train/fine-tune the GPT-2 model using your wrapper. I have successfully trained it on a text file, but I would like to train the model on a larger corpus (about 10,000 words on a specific topic/domain) and have it generate 500–1000 words of output. However, I keep getting a strange error when I try this. How do I increase the amount of training/fine-tuning text from the current limit to about 10,000 words?
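(A likely cause, though the error message is not shown here, is GPT-2's fixed context window: the released models can only attend to 1024 tokens per example, so a 10,000-word corpus cannot be passed as a single input. A common workaround is to tokenize the whole corpus and split it into fixed-size blocks that are fed to the model as separate training examples. The `chunk_tokens` helper below is a hypothetical sketch of that idea, not part of this repository:)

```python
def chunk_tokens(token_ids, block_size=1024):
    """Split a long token sequence into fixed-size training blocks.

    GPT-2 cannot attend beyond its context window (1024 tokens for
    the released checkpoints), so a long corpus must be fed as many
    <= block_size examples rather than as one giant input.
    """
    return [token_ids[i:i + block_size]
            for i in range(0, len(token_ids), block_size)]

# Illustrative only: 2500 stand-in token ids become three blocks.
corpus = list(range(2500))
blocks = chunk_tokens(corpus)
print([len(b) for b in blocks])  # → [1024, 1024, 452]
```

(Each block would then be encoded with the model's BPE tokenizer and used as one training example; the corpus size itself is then unlimited, only the per-example length is capped.)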