openlm-research / open_llama

OpenLLaMA, a permissively licensed open source reproduction of Meta AI’s LLaMA 7B trained on the RedPajama dataset
Apache License 2.0

May I ask about the configs of pre-training? For example, did you use dropout? #21

Closed by joytianya 1 year ago

joytianya commented 1 year ago

May I ask about the configs of pre-training? For example, did you use dropout?

young-geng commented 1 year ago

We use the exact same hyperparameters as described in the original LLaMA paper.
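For reference, the 7B pre-training hyperparameters reported in the LLaMA paper can be sketched as a config dict. This is an illustrative summary, not a snippet from the OpenLLaMA codebase; the field names are hypothetical, and notably the paper does not mention dropout:

```python
# Hypothetical summary of the 7B pre-training hyperparameters reported in the
# LLaMA paper (Touvron et al., 2023). Field names are illustrative only.
llama_7b_pretrain_config = {
    "optimizer": "AdamW",
    "beta1": 0.9,
    "beta2": 0.95,
    "weight_decay": 0.1,
    "gradient_clip_norm": 1.0,
    "peak_learning_rate": 3e-4,      # 7B model's peak LR
    "lr_schedule": "cosine",         # decays to 10% of the peak LR
    "warmup_steps": 2000,
    "batch_size_tokens": 4_000_000,  # 4M tokens per batch
    "dropout": 0.0,                  # dropout is not mentioned in the paper,
                                     # i.e. effectively disabled
}
```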

joytianya commented 1 year ago

I didn't find dropout in the paper. May I ask whether LLaMA used dropout?