nlp:non-autoregressive_seq2seq

Section [Key Papers], revised 2021/11/15 05:34 and 2024/05/03 03:37 (current) by jmflanig.
  * **[[https://arxiv.org/pdf/2004.07437.pdf|Non-Autoregressive Machine Translation with Latent Alignments]]** From Google; uses the CTC loss, following Gu & Kong 2020
  * [[https://arxiv.org/pdf/2006.10369.pdf|Kasai et al 2020 - Deep Encoder, Shallow Decoder: Reevaluating Non-autoregressive Machine Translation]]
  * [[https://arxiv.org/pdf/2404.12022|Wu et al 2024 - Parallel Decoding via Hidden Transfer for Lossless Large Language Model Acceleration]]

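The CTC loss mentioned above is what lets these models emit all target positions in one parallel pass: the raw prediction is longer than the target, and repeated tokens and blanks are collapsed afterwards. A minimal sketch of that collapse step (the blank symbol and token strings here are hypothetical, for illustration only):

```python
BLANK = "<b>"  # hypothetical blank symbol used only for this sketch

def ctc_collapse(tokens):
    """Collapse consecutive repeats, then drop blanks, as in CTC decoding."""
    out = []
    prev = None
    for t in tokens:
        if t != prev and t != BLANK:  # keep first of each run, skip blanks
            out.append(t)
        prev = t
    return out

# A raw parallel prediction of length 8 collapses to the final output:
print(ctc_collapse(["the", "the", "<b>", "cat", "cat", "<b>", "sat", "sat"]))
# → ['the', 'cat', 'sat']
```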
===== Papers =====
  * [[https://arxiv.org/pdf/2305.10427.pdf|Santilli et al 2023 - Accelerating Transformer Inference for Translation via Parallel Decoding]]

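The parallel-decoding idea in Santilli et al 2023 views greedy autoregressive decoding as a fixed point of y[i] = f(y[:i]), which can be reached by updating every position simultaneously (Jacobi iteration) instead of left to right. A toy sketch of that equivalence, where ''next_token'' is a hypothetical stand-in for an argmax over a real model:

```python
def next_token(prefix):
    # Toy deterministic "model": stands in for argmax over model logits.
    return min(2 * len(prefix), 9)

def greedy_decode(length):
    """Standard left-to-right greedy decoding: one position per step."""
    y = []
    for _ in range(length):
        y.append(next_token(y))
    return y

def jacobi_decode(length, pad=0):
    """Jacobi-style parallel decoding: update all positions at once,
    iterate until the sequence stops changing (a fixed point)."""
    y = [pad] * length  # initial guess
    for _ in range(length):  # at most `length` sweeps are needed
        new = [next_token(y[:i]) for i in range(length)]  # parallel update
        if new == y:  # fixed point reached: matches greedy decoding
            break
        y = new
    return y

print(jacobi_decode(5) == greedy_decode(5))  # → True
```

The speedup in practice comes from each sweep being one batched forward pass, with convergence often in far fewer sweeps than the sequence length.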
nlp/non-autoregressive_seq2seq.1636954497.txt.gz · Last modified: 2023/06/15 07:36 (external edit)
