Calculate perplexity on a text with pre-trained language models. Supports masked LMs (e.g. DeBERTa), autoregressive LMs (e.g. GPT-3), and encoder-decoder LMs (e.g. Flan-T5).
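Perplexity is the exponentiated average negative log-likelihood of the tokens under the model. A minimal sketch of that formula, assuming the per-token log-probabilities have already been obtained from one of the model types above (the function name `perplexity` is illustrative, not part of this library's API):

```python
import math

def perplexity(token_logprobs):
    """Perplexity = exp(-mean log p(token)).

    token_logprobs: natural-log probabilities the model assigned
    to each token in the text.
    """
    if not token_logprobs:
        raise ValueError("need at least one token log-probability")
    return math.exp(-sum(token_logprobs) / len(token_logprobs))

# If every token gets probability 1/4, perplexity is ~4.0:
# the model is as uncertain as a uniform choice among 4 tokens.
print(perplexity([math.log(0.25)] * 3))
```

Lower perplexity means the model found the text more predictable; note that values are only comparable between models that share a tokenizer.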
MIT License