asahi417 / lmppl

Calculate the perplexity of a text with pre-trained language models. Supports masked LMs (e.g. DeBERTa), recurrent LMs (e.g. GPT-3), and encoder-decoder LMs (e.g. Flan-T5).
MIT License

Please Update Readme - Available Models #9

Open

galonpy commented 1 year ago

Hello! This is a great package, but it would help to know all the models it's set up for. I see references in the code to GPT-2 XL and also GPT-4. Are the examples in the README the only ones that should be used at this time?
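
For context, here is a minimal sketch of the three scorer classes the README documents (`lmppl.LM`, `lmppl.MaskedLM`, `lmppl.EncoderDecoderLM`). The specific checkpoints below are illustrative assumptions, not a confirmed compatibility list, which is exactly what this issue is asking to have documented:

```python
# Sketch based on the README's three scorer classes; the checkpoint names
# below are assumptions for illustration, not a confirmed support list.
import lmppl

# Recurrent (causal) LM scorer, e.g. a GPT-2-style checkpoint.
scorer_lm = lmppl.LM('gpt2')
print(scorer_lm.get_perplexity(['sentence to score']))

# Masked LM scorer, e.g. a DeBERTa checkpoint.
scorer_mlm = lmppl.MaskedLM('microsoft/deberta-v3-small')
print(scorer_mlm.get_perplexity(['sentence to score']))

# Encoder-decoder LM scorer takes paired input/output texts.
scorer_encdec = lmppl.EncoderDecoderLM('google/flan-t5-small')
print(scorer_encdec.get_perplexity(
    input_texts=['translate English to German: Hello.'],
    output_texts=['Hallo.'],
))
```

Presumably any Hugging Face checkpoint matching one of these three architectures loads the same way, but a list of tested models in the README would make that explicit.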