===== Conversion from log Probability to Perplexity =====
To convert from loss (negative log probability, a.k.a. cross entropy) to perplexity, exponentiate the loss (perplexity = exp(loss)); to convert back, take the natural log (loss = ln(perplexity)). When the loss is very low (< 0.1), the perplexity is roughly 1 + loss, since exp(x) ≈ 1 + x for small x.
  
^ Loss (-log(prob)) ^ Perplexity ^
| 3.9 | 50 |
| 2.3 | 10 |
| 2 | 7.38 |
| 1.5 | 4.48 |
| 1 | 2.71 |
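
The conversion is easy to check in code. Here is a minimal Python sketch (the helper names are illustrative, not from any library) that approximately reproduces the rows of the table above and shows the 1 + loss approximation at low loss:

<code python>
import math

def loss_to_perplexity(loss):
    # perplexity = exp(loss), where loss is cross entropy in nats
    return math.exp(loss)

def perplexity_to_loss(perplexity):
    # loss = ln(perplexity), the inverse conversion
    return math.log(perplexity)

# Reproduce the table rows (the table rounds to convenient values),
# plus a low-loss example where perplexity ~ 1 + loss.
for loss in [3.9, 2.3, 2.0, 1.5, 1.0, 0.05]:
    ppl = loss_to_perplexity(loss)
    print(f"loss={loss:.2f}  perplexity={ppl:.2f}  1+loss={1 + loss:.2f}")
</code>

Note that this assumes the loss is measured in nats (natural log). If a model reports loss in bits (log base 2), the conversion is perplexity = 2 ** loss instead.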