openlm-research / open_llama

OpenLLaMA, a permissively licensed open source reproduction of Meta AI’s LLaMA 7B trained on the RedPajama dataset
Apache License 2.0

Replicating Results #17

Closed · usamayaseen-veeva closed this issue 1 year ago

usamayaseen-veeva commented 1 year ago

Great work! Could you please share the configuration/values of the sampling parameters (for each task) needed to replicate the results reported on the README page? Thanks!

young-geng commented 1 year ago

These tasks are evaluated using only the log likelihood of the model, so no sampling hyperparameters are involved.
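
For reference, a minimal sketch of what log-likelihood scoring for a multiple-choice task looks like, in the spirit of lm-evaluation-harness: each candidate answer is scored by the sum of token log probabilities of the continuation given the question, and the highest-scoring choice wins. This is not the authors' exact evaluation code; the checkpoint name, the example question, and the helper function below are illustrative assumptions.

```python
# Sketch only: score answer choices by log likelihood, no sampling involved.
# Assumes the HuggingFace checkpoint "openlm-research/open_llama_7b".
import torch
from transformers import LlamaTokenizer, LlamaForCausalLM

model_name = "openlm-research/open_llama_7b"
tokenizer = LlamaTokenizer.from_pretrained(model_name)
model = LlamaForCausalLM.from_pretrained(model_name, torch_dtype=torch.float16)
model.eval()

def continuation_loglikelihood(context: str, continuation: str) -> float:
    """Sum of log p(continuation tokens | context) under the model."""
    ctx_ids = tokenizer(context, return_tensors="pt").input_ids
    full_ids = tokenizer(context + continuation, return_tensors="pt").input_ids
    with torch.no_grad():
        logits = model(full_ids).logits  # [1, seq_len, vocab]
    # Position i predicts token i+1, so shift logits/targets by one.
    log_probs = torch.log_softmax(logits[:, :-1, :], dim=-1)
    targets = full_ids[:, 1:]
    token_ll = log_probs.gather(-1, targets.unsqueeze(-1)).squeeze(-1)
    # Keep only the positions that predict continuation tokens.
    n_ctx = ctx_ids.shape[1]
    return token_ll[0, n_ctx - 1:].sum().item()

# Hypothetical multiple-choice item: pick the answer with the highest log likelihood.
question = "Q: What is the capital of France?\nA:"
choices = [" Paris", " London", " Berlin"]
scores = [continuation_loglikelihood(question, c) for c in choices]
print(choices[scores.index(max(scores))])
```

Because scoring is a deterministic argmax over per-choice log likelihoods, there is no temperature, top-p, or other sampling hyperparameter to report. (Note the sketch tokenizes context and context+continuation separately, so boundary tokenization may differ slightly from the harness's exact behavior.)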