Recurrent Neural Networks
RNN Variants
- Zhang et al 2019 - A Lightweight Recurrent Network for Sequence Modeling. Related to the Transformer, LRNs are a drop-in replacement for other RNNs that removes the sequential nature of RNN processing, essentially using a key-query-value attention mechanism instead of the recurrence.
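The key-query-value idea above can be sketched in a few lines. This is a hedged illustration of the general mechanism, not the paper's exact formulation: the gate roles assigned to the key and query, and the tanh on the value, are assumptions for the sketch. The point it shows is that the heavy matrix products depend only on the input, so they can be precomputed for a whole sequence in parallel, leaving only a cheap elementwise recurrence.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lrn_step(x, h_prev, Wk, Wq, Wv):
    # Keys, queries, and values are computed from the input alone (no
    # matmul against h_prev), so these projections parallelize over time;
    # only the elementwise update on the last line stays sequential.
    # Gate roles here are an assumption, not the paper's exact equations.
    k = sigmoid(Wk @ x)        # key, used as a forget-style gate
    q = sigmoid(Wq @ x)        # query, used as an input-style gate
    v = np.tanh(Wv @ x)        # value: candidate state from the input
    return k * h_prev + q * v  # elementwise recurrence only

rng = np.random.default_rng(0)
d = 4
Wk, Wq, Wv = (rng.standard_normal((d, d)) * 0.1 for _ in range(3))
h = np.zeros(d)
for x in rng.standard_normal((6, d)):
    h = lrn_step(x, h, Wk, Wq, Wv)
```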
Theoretical Properties
- Weiss et al 2018 - On the Practical Computational Power of Finite Precision RNNs for Language Recognition Shows that RNN variants that can count (e.g. LSTMs) are strictly more expressive at finite precision than those that cannot (e.g. GRUs), and verifies this experimentally.
- Merrill et al 2020 - A Formal Hierarchy of RNN Architectures Interesting, but possibly incorrect in practice because the theoretical analysis is for saturated RNNs (related follow-up here).
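The counting claim in Weiss et al can be illustrated with a hand-set single-unit LSTM cell update: with the forget and input gates saturated at 1 and the candidate mapping 'a' to +1 and 'b' to -1, the cell state acts as an unbounded counter, which a bounded-state network cannot simulate at finite precision. The weights below are chosen by hand for illustration, not learned.

```python
def lstm_counter(seq):
    # Single-unit LSTM cell update c_t = f*c + i*g with hand-set,
    # saturated gates. The candidate g comes from the input symbol.
    c = 0.0
    for ch in seq:
        f, i = 1.0, 1.0                  # saturated forget/input gates
        g = 1.0 if ch == 'a' else -1.0   # candidate: +1 for 'a', -1 for 'b'
        c = f * c + i * g                # standard LSTM cell-state update
    return c

# The cell state counts a's minus b's; balanced strings return to zero.
# (Full a^n b^n recognition would additionally need an order check.)
assert lstm_counter("aaabbb") == 0.0
assert lstm_counter("aab") != 0.0
```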
People
Converting RNNs to WFSAs
Related Pages
nlp/rnn.txt · Last modified: 2023/07/30 18:14 by jmflanig