Perplexity
Perplexity is a standard way to evaluate language models: it is the exponentiated average negative log probability (cross entropy) per token, so lower perplexity means the model assigns higher probability to the data.
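As a minimal sketch of that definition, the function below (`perplexity` is a name chosen here, not from any particular library) computes perplexity from the per-token probabilities a model assigns to a sequence:

```python
import math

def perplexity(token_probs):
    """Perplexity over a sequence: exp of the average negative
    log probability the model assigns to each token."""
    avg_nll = -sum(math.log(p) for p in token_probs) / len(token_probs)
    return math.exp(avg_nll)

# A model that assigns probability 0.25 to every token has perplexity 4:
# it is as uncertain as a uniform choice among 4 options.
print(perplexity([0.25, 0.25, 0.25, 0.25]))
```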
Conversion from Log Probability to Perplexity
To convert from loss (negative log probability, a.k.a. cross entropy) to perplexity, exponentiate the loss (perplexity = exp(loss)); the inverse is loss = ln(perplexity). When the loss is very low (< 0.1), the perplexity is roughly 1 + loss, since exp(x) ≈ 1 + x for small x.
| Loss (-log(prob)) | Perplexity |
|---|---|
| 6.9 | 1000 |
| 5.3 | 200 |
| 4.6 | 100 |
| 3.9 | 50 |
| 2.3 | 10 |
| 2.07 | 8 |
| 1.5 | 4.48 |
| 1 | 2.72 |
| .5 | 1.65 |
| .1 | 1.11 |
| .01 | 1.01 |
| 0 | 1 |
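The conversions behind the table can be sketched in a few lines (the function names here are illustrative, not from a library):

```python
import math

def loss_to_perplexity(loss):
    # perplexity = exp(loss)
    return math.exp(loss)

def perplexity_to_loss(perplexity):
    # loss = ln(perplexity)
    return math.log(perplexity)

def small_loss_approx(loss):
    # For loss < 0.1, exp(loss) ~= 1 + loss (first-order Taylor expansion).
    return 1 + loss
```

For example, `loss_to_perplexity(4.6)` gives roughly 100, matching the table, and `small_loss_approx(0.05)` differs from the exact `loss_to_perplexity(0.05)` by only about 0.001.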