graykode / gpt-2-Pytorch

Simple Text-Generator with OpenAI gpt-2 Pytorch Implementation
MIT License

Pytorch Finetuning #6

Closed · jjbrophy47 closed this 5 years ago

jjbrophy47 commented 5 years ago

Hi, I really like this repository and how easy it is to use as a PyTorch alternative for GPT-2. In this pull request, I've added the ability to fine-tune a pre-trained GPT-2 model in PyTorch. I've adapted the training code from nshepperd: https://github.com/nshepperd/gpt-2/blob/finetuning/train.py

I hope you find this useful! Let me know if you have any questions or concerns! -Jonathan Brophy

graykode commented 5 years ago

Hello. First, thank you for your sincere pull request! I will read your code line by line as carefully as possible. 👍

graykode commented 5 years ago

I read your code quickly and have some questions. I've left code review comments on the lines corresponding to each question.

graykode commented 5 years ago

The archive branch is for recording earlier commits, so could you re-submit your pull request against the train branch? Thank you again.

graykode commented 5 years ago

Thanks! I will also apply these modifications to the master branch.