CodedotAl / gpt-code-clippy

Full description can be found here: https://discuss.huggingface.co/t/pretrain-gpt-neo-for-open-source-github-copilot-model/7678?u=ncoop57
Apache License 2.0

Fine-tuning on GPT-J #87

Open BalajiAJ opened 1 year ago

BalajiAJ commented 1 year ago

Hi,

Are there any recommended steps or resources for fine-tuning a large language model such as GPT-J in an unsupervised way using GPT-Code-Clippy, with the goal of teaching the model about a new domain?
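
For concreteness, the kind of plain causal-LM fine-tuning I have in mind is sketched below using Hugging Face Transformers (model checkpoint, data paths, and hyperparameters are placeholders, and a smaller model would likely be needed unless multiple GPUs or DeepSpeed are available). Please correct me if the repo's own training scripts are the recommended route instead:

```python
# Minimal causal-LM fine-tuning sketch (assumptions: placeholder corpus path,
# illustrative hyperparameters, enough GPU memory for the chosen checkpoint).
from datasets import load_dataset
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

model_name = "EleutherAI/gpt-j-6B"  # swap for a smaller checkpoint to test the pipeline

tokenizer = AutoTokenizer.from_pretrained(model_name)
tokenizer.pad_token = tokenizer.eos_token  # GPT-J's tokenizer has no dedicated pad token

model = AutoModelForCausalLM.from_pretrained(model_name)

# Plain-text files from the new domain; path is a placeholder.
dataset = load_dataset("text", data_files={"train": "domain_corpus.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

tokenized = dataset.map(tokenize, batched=True, remove_columns=["text"])

# mlm=False gives standard next-token (causal) language modeling.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False)

args = TrainingArguments(
    output_dir="gptj-domain-finetune",
    per_device_train_batch_size=1,
    gradient_accumulation_steps=16,
    num_train_epochs=1,
    fp16=True,
    logging_steps=50,
    save_steps=500,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=tokenized["train"],
    data_collator=collator,
)
trainer.train()
```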

Thanks