Lightning-AI / lit-llama

Implementation of the LLaMA language model based on nanoGPT. Supports flash attention, Int8 and GPTQ 4bit quantization, LoRA and LLaMA-Adapter fine-tuning, pre-training. Apache 2.0-licensed.

(documentation) error in readme.md about (Facebook's) LLaMA license #448

Closed: maathieu closed this issue 1 year ago

maathieu commented 1 year ago

Hi,

The license used for the original LLaMA developed by Facebook is not the GPL but a specific Facebook-designed license called the LLAMA 2 Community License Agreement.

Can you update the readme?

Many thanks in advance,

mathieu

rasbt commented 1 year ago

LLAMA 2 Community License Agreement

Thanks for the note! But I think the original LLaMA license remains unchanged unless I missed something. This repo implements the original LLaMA; Llama 2 is implemented here: https://github.com/Lightning-AI/lit-gpt