Calculate perplexity on a text with pre-trained language models. Supports MLM (e.g. DeBERTa), causal LM (e.g. GPT-3), and encoder-decoder LM (e.g. Flan-T5).
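For context, here is a minimal sketch of how the three model families are typically scored with this package, assuming the `lmppl.LM`, `lmppl.MaskedLM`, and `lmppl.EncoderDecoderLM` classes and the `get_perplexity` method shown in the README; the checkpoint names are illustrative, and in principle any compatible Hugging Face model ID should load the same way:

```python
# Hedged sketch based on the README examples; class names, the
# get_perplexity method, and the checkpoint IDs are assumptions
# taken from those examples, not an exhaustive support list.
import lmppl

texts = ["I dropped my laptop on my knee, and someone stole my coffee."]

# Causal (decoder-only) LM, e.g. GPT-2; returns one perplexity per input.
scorer = lmppl.LM('gpt2')
print(scorer.get_perplexity(texts))

# Masked LM, e.g. DeBERTa; perplexity is computed via pseudo-likelihood.
scorer = lmppl.MaskedLM('microsoft/deberta-v3-small')
print(scorer.get_perplexity(texts))

# Encoder-decoder LM, e.g. Flan-T5; the text is split into an
# encoder input and a decoder output to be scored.
scorer = lmppl.EncoderDecoderLM('google/flan-t5-small')
print(scorer.get_perplexity(
    input_texts=["I dropped my laptop on my knee, and someone stole my coffee."],
    output_texts=["I am sad."],
))
```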
Hello! This is a great package, but it would help to know which models it's set up for. I see references in the code to GPT-2 XL and also GPT-4. Are the examples in the README the only ones that should be used at this time?