openlm-research / open_llama

OpenLLaMA, a permissively licensed open source reproduction of Meta AI’s LLaMA 7B trained on the RedPajama dataset
Apache License 2.0

Training with larger context length? #3

Open · cksac opened this issue 1 year ago

cksac commented 1 year ago

Is there any plan to train with a larger context length? That would make open_llama better than the original LLaMA.

ehartford commented 1 year ago

We really need 8k, if not 32k, context. Is there a way to adjust the architecture to allow for that?
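
For background, LLaMA uses rotary position embeddings (RoPE) rather than a learned absolute position table, so the context limit comes from the position range seen during training rather than a hard architectural cap. A minimal sketch of the rotary angle computation (standard PyTorch; the function name is illustrative, not the repo's actual code):

```python
import torch

def rope_angles(head_dim: int, seq_len: int, base: float = 10000.0):
    """Rotary position embedding angles, LLaMA-style.

    Each pair of channels rotates by angle theta_i * p at position p.
    Nothing here caps seq_len: longer contexts are a question of what
    positions the model was trained on, not of the formula itself.
    """
    inv_freq = 1.0 / (base ** (torch.arange(0, head_dim, 2).float() / head_dim))
    positions = torch.arange(seq_len).float()
    angles = torch.outer(positions, inv_freq)  # (seq_len, head_dim // 2)
    return torch.cos(angles), torch.sin(angles)
```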

ehartford commented 1 year ago

I see that MPT-7B supports up to 64k context using ALiBi.
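
For reference, ALiBi (Press et al.) drops positional embeddings entirely and instead adds a per-head linear penalty to the attention logits based on query-key distance, which is what lets MPT-7B run past its training context. A minimal PyTorch sketch of the bias (shapes and naming are illustrative, not MPT's actual code):

```python
import torch

def alibi_bias(n_heads: int, seq_len: int) -> torch.Tensor:
    # Per-head slopes: geometric sequence 2^(-8/n), 2^(-16/n), ..., 2^(-8)
    # (the paper's recipe when n_heads is a power of two).
    start = 2.0 ** (-8.0 / n_heads)
    slopes = torch.tensor([start ** (i + 1) for i in range(n_heads)])

    # Relative position j - i, clamped at 0 so future tokens (handled by
    # the causal mask) contribute no bias here.
    pos = torch.arange(seq_len)
    distance = (pos[None, :] - pos[:, None]).clamp(max=0)  # (seq, seq)

    # (n_heads, seq, seq): added to attention logits before softmax, so
    # each head penalizes distant keys at a different linear rate.
    return slopes[:, None, None] * distance[None, :, :].float()
```

Because the bias depends only on relative distance, it is defined for any sequence length, which is why ALiBi models can extrapolate beyond the context they were trained on.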