CodedotAl / gpt-code-clippy

Full description can be found here: https://discuss.huggingface.co/t/pretrain-gpt-neo-for-open-source-github-copilot-model/7678?u=ncoop57
Apache License 2.0

Training and fine-tuning on GPT-J #63

Closed uSaiPrashanth closed 2 years ago

uSaiPrashanth commented 2 years ago

I'm trying to fine-tune GPT-J to create a better version of Code Clippy.

I have already written a fine-tuning script. However, it would require a beefy TPU (a v3-256 would take about 6 weeks, I believe), and thus I cannot train it myself.
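(Editor's note: the script referenced above is not included in this thread. For readers wondering what fine-tuning GPT-J involves, here is a minimal sketch using Hugging Face Transformers. The dataset name and hyperparameters are illustrative placeholders, and a model of this size realistically needs TPU- or multi-GPU-scale hardware, as the comment notes.)

```python
# Minimal sketch of causal-LM fine-tuning for GPT-J on a code corpus.
# NOT the author's actual script; dataset and hyperparameters are placeholders.
from datasets import load_dataset
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

model_name = "EleutherAI/gpt-j-6B"
tokenizer = AutoTokenizer.from_pretrained(model_name)
tokenizer.pad_token = tokenizer.eos_token  # GPT-J has no pad token by default
model = AutoModelForCausalLM.from_pretrained(model_name)

# Hypothetical code dataset; swap in the Code Clippy data in practice.
raw = load_dataset("codeparrot/codeparrot-clean", split="train[:1%]")

def tokenize(batch):
    # Truncate source files to the model's context window slice used for training.
    return tokenizer(batch["content"], truncation=True, max_length=1024)

tokenized = raw.map(tokenize, batched=True, remove_columns=raw.column_names)

args = TrainingArguments(
    output_dir="gpt-j-code-clippy",
    per_device_train_batch_size=1,
    gradient_accumulation_steps=16,  # compensate for the tiny per-device batch
    learning_rate=5e-5,
    num_train_epochs=1,
    fp16=True,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=tokenized,
    # mlm=False gives standard next-token (causal LM) labels.
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```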

It would be great if this repository could help, in the long run, toward creating an open-source version of GitHub Copilot.

ncoop57 commented 2 years ago

Thanks for opening this issue. We definitely want to fine-tune a GPT-J model, so any help with that would be greatly appreciated. We are also looking into how to run training on beefy TPUs. If you'd like to discuss ways to train an open-source Copilot/Codex model with us, please join us on our Discord: https://discord.gg/68NZFfxHxD

Our main contributors are taking a pause for other projects and some rest, but we will hopefully ramp up work on this project in the next week or so.

uSaiPrashanth commented 2 years ago

Now that the GPT-J Code Clippy model is under active development, I'm closing this issue.