@ivanmkc Perplexity ranges between one and infinity: the exponent is the average of the per-token negative log likelihoods, which is never negative, so the minimum value is e^0 = 1. Check out the following blog post for a better understanding: *Perplexity of fixed-length models*.
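As a quick sanity check on that lower bound, a token probability can never exceed 1, so its negative log likelihood can never be negative. A minimal sketch (the probabilities below are made up for illustration):

```python
import math

# Token probabilities are at most 1.0, so -log(p) >= 0 for every token.
for p in [1.0, 0.5, 0.01]:
    print(p, -math.log(p))  # 0.0, ~0.693, ~4.605
```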
Thanks, will take a look.
https://github.com/huggingface/evaluate/blob/8dfe05784099fb9af55b8e77793205a3b7c86465/metrics/perplexity/README.md?plain=1#L60
`perplexity = e**(sum(losses) / num_tokenized_tokens)`

If sum(losses) = 0, then perplexity = 1; and since each loss is non-negative, 1 is the lowest value the metric can take.
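For concreteness, here is a minimal sketch of that formula in plain Python. The per-token losses are invented values for illustration, not the output of any real model:

```python
import math

# Per-token negative log likelihoods (cross-entropy losses); illustrative values.
losses = [2.1, 0.7, 1.3, 0.9]

# perplexity = e**(sum(losses) / num_tokenized_tokens)
perplexity = math.exp(sum(losses) / len(losses))
print(perplexity)  # ~3.49

# Every loss is -log(p) with p <= 1, so sum(losses) >= 0 and the
# smallest achievable perplexity is e**0 = 1.
print(math.exp(0.0))  # 1.0
```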