nlp:perplexity

===== Conversion from log Probability to Perplexity =====
To convert from loss (-log probability, a.k.a. cross entropy) to perplexity, exponentiate the loss (perplexity = exp(loss)); conversely, loss = ln(perplexity). When the loss is very low (< 0.1), the perplexity is roughly 1 + loss, since exp(x) ≈ 1 + x for small x.
  
^ Loss (-log(prob)) ^ Perplexity ^
| 6.9 | 1000 |
| 5.3 | 200 |
| 4.6 | 100 |
| 3.9 | 50 |
| 2.3 | 10 |
| 2 | 7.39 |
| 1.5 | 4.48 |
| 1 | 2.72 |
| 0.5 | 1.65 |
| 0.1 | 1.11 |
| 0.01 | 1.01 |
| 0 | 1 |
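The conversion above can be sketched in Python using only the standard-library ''math'' module (the function names here are illustrative, not from any particular library):

```python
import math

def perplexity(loss: float) -> float:
    """Convert cross-entropy loss (-log probability, in nats) to perplexity."""
    return math.exp(loss)

def loss_from_perplexity(ppl: float) -> float:
    """Convert perplexity back to cross-entropy loss (natural log)."""
    return math.log(ppl)

print(perplexity(4.6))            # ~99.5, matching the table row 4.6 -> 100
print(loss_from_perplexity(100))  # ~4.61

# For small losses, exp(x) is approximately 1 + x,
# so perplexity is roughly 1 + loss:
print(perplexity(0.1))            # ~1.105
```

Note that this assumes the loss is measured in nats (natural log); a loss reported in bits (log base 2) would use 2**loss instead.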
  
===== Related Pages =====
nlp/perplexity.1675277583.txt.gz · Last modified: 2023/06/15 07:36 (external edit)
