
Perplexity

Perplexity is a standard metric for evaluating the performance of language models: it measures how well a model predicts a sample of text, with lower values indicating better predictions.
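A minimal sketch of this computation, assuming we already have the natural-log probabilities a model assigned to each token in a sequence (the function name and the example values below are illustrative, not from a specific model):

<code python>
import math

def perplexity(token_log_probs):
    """Perplexity over a sequence: exp of the mean negative natural-log
    probability the model assigned to each observed token."""
    avg_nll = -sum(token_log_probs) / len(token_log_probs)
    return math.exp(avg_nll)

# Hypothetical log probabilities a model assigned to four tokens:
log_probs = [math.log(0.2), math.log(0.5), math.log(0.1), math.log(0.4)]
print(perplexity(log_probs))  # ~3.98: on average the model is about as
                              # uncertain as a uniform choice over ~4 tokens
</code>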

Conversion from Log Probability to Perplexity

To convert from loss (-log probability, also known as cross entropy) to perplexity, exponentiate the loss: perplexity = exp(loss). The inverse conversion is loss = ln(perplexity). When the loss is very low (< 0.1), the perplexity is roughly 1 + loss, since exp(x) ≈ 1 + x for small x (the first-order Taylor expansion).
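A small sketch of both conversions and of the small-loss approximation (the function names are illustrative):

<code python>
import math

def loss_to_perplexity(loss: float) -> float:
    """Perplexity = exp(loss), where loss is the mean -log probability in nats."""
    return math.exp(loss)

def perplexity_to_loss(perplexity: float) -> float:
    """Loss = ln(perplexity), the inverse conversion."""
    return math.log(perplexity)

# For small losses, exp(loss) is close to 1 + loss:
print(loss_to_perplexity(0.1))   # 1.105..., vs. approximation 1 + 0.1 = 1.1
print(loss_to_perplexity(0.01))  # 1.01005..., vs. approximation 1.01
</code>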

Loss (-log(prob), nats)   Perplexity
6.9                       1000
5.3                       200
4.6                       100
3.9                       50
2.3                       10
2                         7.39
1.5                       4.48
1                         2.72
0.5                       1.65
0.1                       1.11
0.01                      1.01
0                         1