AetherCortex / Llama-X

Open Academic Research on Improving LLaMA to SOTA LLM
Apache License 2.0

Does this repository support training of the GPT series? #1

Open Sierkinhane opened 1 year ago

victorsungo commented 1 year ago

As we know, the GPT series is not currently open-source. This project focuses mainly on optimizing the LLaMA model, which was released by Meta AI. Meanwhile, we treat the GPT series as our baseline and will compare each version of Llama-X against it in evaluation.

Sierkinhane commented 1 year ago

Pre-trained weights for GPT-1 and GPT-2 are available, but GPT-3's are not. Great work! Thanks for your contributions.