nlp:perplexity

  
===== Conversion from log Probability to Perplexity =====
To convert from loss (negative log probability, a.k.a. cross entropy) to perplexity, exponentiate the loss (perplexity = exp(loss)); conversely, loss = ln(perplexity). When the loss is very low (< 0.1), the perplexity is roughly 1 + loss.
  
^ Loss (-log(prob)) ^ Perplexity ^
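The conversion above can be sketched in a few lines of Python (the function names here are illustrative, not from any particular library):

```python
import math

def perplexity_from_loss(loss: float) -> float:
    """Convert cross-entropy loss (-log probability) to perplexity."""
    return math.exp(loss)

def loss_from_perplexity(perplexity: float) -> float:
    """Convert perplexity back to cross-entropy loss."""
    return math.log(perplexity)

# For small losses, perplexity ~ 1 + loss: this is the first-order
# Taylor expansion of exp(x) around 0.
small_loss = 0.05
print(perplexity_from_loss(small_loss))   # close to 1.05

# Round trip: loss -> perplexity -> loss recovers the original value.
print(loss_from_perplexity(perplexity_from_loss(2.0)))
```

A loss of 0 corresponds to a perplexity of exactly 1 (the model assigns probability 1 to the observed text), which is why the approximation 1 + loss works near zero.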
nlp/perplexity.1675278004.txt.gz · Last modified: 2023/06/15 07:36 (external edit)
