Closed: uSaiPrashanth closed this issue 3 years ago
Thanks for opening this issue. We definitely want to fine-tune a GPT-J model, so any help with that would be greatly appreciated. We are also looking into how to run training on beefy TPUs. If you'd like to discuss ways to train an open-source Copilot/Codex model with us, please join us on our Discord: https://discord.gg/68NZFfxHxD
Our main contributors are taking a break for other projects and some rest, but we hope to ramp work on this project back up in the next week or so.
Now that the GPT-J version of Code Clippy is under active development, I'm closing this issue.
Trying to fine-tune GPT-J to create a better version of Code Clippy
I have already written a fine-tuning script. However, training would require a beefy TPU (a v3-256 would take about 6 weeks, I believe), so I cannot train it myself.
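For context, here is a minimal sketch of what causal-LM fine-tuning of GPT-J looks like with HuggingFace Transformers. This is not the script mentioned above; the checkpoint name, dataset file, and hyperparameters are all illustrative assumptions:

```python
# Minimal GPT-J fine-tuning sketch (illustrative, not the author's script).
# Assumes: the EleutherAI/gpt-j-6B checkpoint, a plain-text code corpus
# named code_corpus.txt, and placeholder hyperparameters.
from datasets import load_dataset
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

model_name = "EleutherAI/gpt-j-6B"  # assumed checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_name)
tokenizer.pad_token = tokenizer.eos_token  # GPT-J's tokenizer has no pad token
model = AutoModelForCausalLM.from_pretrained(model_name)

# Hypothetical training corpus; swap in your own code dataset.
dataset = load_dataset("text", data_files={"train": "code_corpus.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

train_ds = dataset["train"].map(tokenize, batched=True, remove_columns=["text"])

# mlm=False makes the collator copy input_ids into labels for causal LM loss.
collator = DataCollatorForLanguageModeling(tokenizer, mlm=False)

args = TrainingArguments(
    output_dir="gptj-code-clippy",
    per_device_train_batch_size=1,   # 6B params: expect heavy memory pressure
    gradient_accumulation_steps=16,
    learning_rate=5e-5,
    num_train_epochs=1,
    fp16=True,
)

Trainer(
    model=model,
    args=args,
    train_dataset=train_ds,
    data_collator=collator,
).train()
```

For a TPU v3-256 run like the one estimated above, the training loop would more likely be built on JAX (e.g., mesh-transformer-jax, which GPT-J itself was trained with) rather than this PyTorch Trainer; the sketch only shows the general fine-tuning shape.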
It would be great if, in the long run, this repository could help create an open-source version of GitHub Copilot.