====== Non-Autoregressive Sequence-to-Sequence Models ======
Non-autoregressive seq2seq models produce outputs in parallel rather than one word at a time.
===== Autoregressive vs Non-Autoregressive =====
**Definition:** An **autoregressive** model generates tokens one at a time, each conditioned on the sequence of tokens previously generated.

**Definition:** A **non-autoregressive** model generates all output tokens in parallel, so no token is conditioned on the other generated tokens.

**Note:** There are also **global models** like [[ml:
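To make the contrast above concrete, here is a minimal sketch of the two decoding control flows. The model functions (''ar_step'', ''nar_position'') are hypothetical stand-ins that just copy the source, not real trained models; the point is only that autoregressive decoding is a sequential loop over a growing prefix, while non-autoregressive decoding predicts every position independently and in parallel.

```python
def ar_step(source, prefix):
    # Hypothetical autoregressive step: the next token is conditioned
    # on the prefix generated so far (here, trivially, its length).
    target = source.split()
    return target[len(prefix)] if len(prefix) < len(target) else "<eos>"

def autoregressive_decode(source):
    # Sequential: one token per step, each step sees all previous tokens.
    prefix = []
    while True:
        tok = ar_step(source, prefix)
        if tok == "<eos>":
            return prefix
        prefix.append(tok)

def nar_position(source, i):
    # Hypothetical per-position predictor: depends only on the source
    # and the position index, never on other generated tokens.
    return source.split()[i]

def non_autoregressive_decode(source):
    # Parallel: a target length is chosen up front, then every position
    # is predicted independently (a single step on parallel hardware).
    target_len = len(source.split())
    return [nar_position(source, i) for i in range(target_len)]

src = "the cat sat"
print(autoregressive_decode(src))      # T sequential steps
print(non_autoregressive_decode(src))  # one parallel step
```

Because each ''nar_position'' call ignores the other outputs, real non-autoregressive models must handle the resulting independence assumption (e.g. multimodality across positions), which is what much of the literature below addresses.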
===== Summary =====
===== Key Papers =====
  * [[https://
  * **[[https://
  * [[https://
  * [[https://
  * [[https://
  * **[[https://
  * [[https://
  * [[https://
===== Papers =====
  * [[https://
nlp/non-autoregressive_seq2seq.1614133165.txt.gz · Last modified: 2023/06/15 07:36 (external edit)