asahi417 / lmppl

Calculate perplexity on a text with pre-trained language models. Supports MLM (e.g., DeBERTa), recurrent LM (e.g., GPT3), and encoder-decoder LM (e.g., Flan-T5).
MIT License

Can the LLaMA model be used for this project? #8

Open KangkangStu opened 1 year ago

Rishab9991 commented 6 months ago

Hi. In addition to the LLaMA models, are the BLOOM models compatible?

Thanks.