Recurrent Neural Networks
RNN Variants
- Zhang et al 2019 - A Lightweight Recurrent Network for Sequence Modeling. Related to the Transformer: LRNs are a drop-in replacement for other RNNs that removes the sequential nature of RNN processing, essentially using a key-query-value attention mechanism in place of the recurrence.
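A minimal sketch of the general idea (not the paper's exact architecture): project the whole input sequence into queries, keys, and values in parallel, then use causally masked attention so there is no hidden-to-hidden recurrence to unroll step by step. All names (`causal_attention`, `Wq`, `Wk`, `Wv`) are hypothetical.

```python
import numpy as np

def causal_attention(X, Wq, Wk, Wv):
    """Replace step-by-step recurrence with one parallel attention pass.
    X: (T, d) input sequence; Wq/Wk/Wv: (d, d) projections (hypothetical)."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv           # all timesteps at once
    scores = Q @ K.T / np.sqrt(K.shape[-1])    # (T, T) similarities
    # causal mask: position t may only attend to positions <= t
    mask = np.triu(np.ones_like(scores, dtype=bool), k=1)
    scores[mask] = -np.inf
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V                          # (T, d) "states", no recurrence

rng = np.random.default_rng(0)
T, d = 5, 8
X = rng.normal(size=(T, d))
Wq, Wk, Wv = [rng.normal(size=(d, d)) for _ in range(3)]
H = causal_attention(X, Wq, Wk, Wv)
assert H.shape == (T, d)
```

Because every timestep is computed from the inputs alone, the whole sequence can be processed in one batched matrix multiply instead of T sequential steps.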
Theoretical Properties
- Merrill et al 2020 - A Formal Hierarchy of RNN Architectures. Interesting, but possibly inaccurate in practice because the theoretical analysis covers only saturated RNNs (related follow-up here).
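To make the "saturated" caveat concrete: a saturated RNN is the limit where all weights are scaled toward infinity, so squashing activations like tanh collapse to step functions. A tiny numpy illustration (the `scale` parameter and `rnn_step` helper are illustrative, not from the paper):

```python
import numpy as np

def rnn_step(h, x, W, U, scale=1.0):
    """One tanh RNN step; `scale` multiplies all weights.
    As scale -> infinity, tanh saturates to a sign function."""
    return np.tanh(scale * (W @ h + U @ x))

rng = np.random.default_rng(1)
d = 4
W, U = rng.normal(size=(d, d)), rng.normal(size=(d, d))
h, x = rng.normal(size=d), rng.normal(size=d)

soft = rnn_step(h, x, W, U, scale=1.0)    # ordinary continuous state
hard = rnn_step(h, x, W, U, scale=1e6)    # saturated: every unit is ~±1
assert np.allclose(np.abs(hard), 1.0)
assert np.allclose(hard, np.sign(W @ h + U @ x))
```

Real trained networks operate in the `soft` regime, which is why conclusions drawn about the saturated limit may not transfer to practice.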
Converting RNNs to WFSAs
nlp/rnn.1619690385.txt.gz · Last modified: 2023/06/15 07:36 (external edit)