Table of Contents
Recurrent Neural Networks
RNN Variants
Theoretical Properties
Converting RNNs to WFSAs
Related Pages
Recurrent Neural Networks
RNN Variants
Zhang et al 2019 - A Lightweight Recurrent Network for Sequence Modeling
Related to the Transformer, the LRN is a drop-in replacement for other RNN cells: its gates and candidate state are computed from the current input alone (which the paper reads as a key-query-value attention mechanism in place of the usual recurrence), so the sequential dependency is removed from the expensive projections and only a cheap element-wise recurrence remains.
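A minimal sketch of that idea, assuming gates that depend only on the current input; the exact gating and naming in the paper may differ, and the weight names below (Wq, Wk, Wv) are illustrative:

```python
import numpy as np

def lrn_layer(X, Wq, Wk, Wv):
    """Element-wise gated recurrence in the spirit of LRN (Zhang et al., 2019).

    All projections depend only on the current input, so the expensive
    matrix multiplies are computed for every timestep at once; the remaining
    per-step work is cheap element-wise arithmetic.
    X: (T, d_in), Wq/Wk/Wv: (d_in, d_hid).
    """
    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    # Parallel over time: no recurrence inside these projections.
    Q = X @ Wq           # "query" -> forget-gate logits (illustrative naming)
    K = X @ Wk           # "key"   -> input-gate logits
    V = np.tanh(X @ Wv)  # "value" -> candidate state

    i = sigmoid(K)  # input gate
    f = sigmoid(Q)  # forget gate

    # Only this loop is sequential, and it is purely element-wise.
    T, d_hid = V.shape
    h = np.zeros(d_hid)
    outputs = np.empty((T, d_hid))
    for t in range(T):
        h = f[t] * h + i[t] * V[t]
        outputs[t] = h
    return outputs

# Example: 5 steps of 8-dim inputs, 16-dim state.
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 8))
H = lrn_layer(X, rng.normal(size=(8, 16)), rng.normal(size=(8, 16)), rng.normal(size=(8, 16)))
print(H.shape)  # (5, 16)
```

Unrolling the update gives h_t as a gate-weighted sum of past values, which is where the attention reading comes from.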
Theoretical Properties
Weiss et al 2018 - On the Practical Computational Power of Finite Precision RNNs for Language Recognition
Shows that, at finite precision, RNN variants that can implement unbounded counting (LSTMs, ReLU-RNNs) are strictly more expressive than those that cannot (GRUs, squashed sigmoid/tanh RNNs), and verifies this experimentally on languages such as a^n b^n and a^n b^n c^n.
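A hand-wired sketch of the counting idea (my own illustration, not the paper's exact construction): a single LSTM-style unit whose saturated gates let the cell state count #a minus #b, something a bounded GRU state cannot do for unbounded n.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def counting_cell(string, big=20.0):
    """Single-unit LSTM-style counter over strings of 'a' and 'b'.

    Large weights push the gates to saturation, so effectively
    i_t = f_t = 1 and the candidate is +1 on 'a' and -1 on 'b';
    the cell state then tracks #a - #b. (Full a^n b^n recognition
    would also need an order check, omitted here.)
    """
    one_hot = {'a': np.array([1.0, 0.0]), 'b': np.array([0.0, 1.0])}
    W_g = np.array([big, -big])  # candidate: tanh(big) ~ +1 on 'a', -1 on 'b'
    b_i = big                    # input-gate logit  -> sigmoid(big) ~ 1
    b_f = big                    # forget-gate logit -> sigmoid(big) ~ 1

    c = 0.0
    for ch in string:
        x = one_hot[ch]
        i = sigmoid(b_i)
        f = sigmoid(b_f)
        g = np.tanh(W_g @ x)
        c = f * c + i * g
    return c

print(round(counting_cell("aaabbb"), 3))  # ~0.0 (matched counts)
print(round(counting_cell("aaabb"), 3))   # ~1.0 (one unmatched 'a')
```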
Merrill et al 2020 - A Formal Hierarchy of RNN Architectures
Interesting, but the analysis covers saturated RNNs, so its conclusions may not carry over to trained, finite-weight networks in practice (related follow-up here); see the short saturation sketch below.
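For intuition, a saturated RNN is the limit in which all weights are scaled toward infinity, so gate activations collapse to {0, 1} and the network behaves like a discrete automaton; trained networks keep finite weights and soft gates. A tiny numerical illustration (my own, not from the paper):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Scaling the weights by rho drives the gate toward a hard 0/1 decision.
x, w = 0.3, 1.5
for rho in (1, 10, 100, 1000):
    print(rho, sigmoid(rho * w * x))
# 1 -> 0.61..., 10 -> 0.989..., 100 -> ~1.0, 1000 -> 1.0
```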
Hewitt et al 2020 - RNNs can generate bounded hierarchical languages with optimal memory
People
Michael Hahn
Converting RNNs to WFSAs
See Converting RNNs to WFSAs.
Related Pages
Seq2seq
State-Space Models
WFSA