Graves 2013 - Generating Sequences With Recurrent Neural Networks. Uses an alignment mechanism for handwriting generation that is similar to the attention mechanism.

The Deep Learning Book (p. 415, end of Ch. 10) says: “The idea of attention mechanisms for neural networks was introduced even earlier, in the context of handwriting generation (Graves, 2013), with an attention mechanism that was constrained to move only forward in time through the sequence.”
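The mechanism in question is Graves' soft window: at each output step the network emits parameters for a mixture of K Gaussians over character positions, and each Gaussian's center κ is updated additively by a positive increment, so attention can only move forward through the text. A minimal NumPy sketch of one step (function name, shapes, and the single linear projection `W, b` are my assumptions for illustration, not taken from the paper's notation verbatim):

```python
import numpy as np

def gaussian_window(h_t, kappa_prev, char_onehots, W, b):
    """One step of Graves-style soft-window attention.

    h_t          : (H,)   decoder hidden state at this output step
    kappa_prev   : (K,)   previous Gaussian centers (character positions)
    char_onehots : (U, V) the text to attend over, one-hot encoded
    W, b         : (H, 3K), (3K,) projection to the 3K mixture parameters
    """
    K = kappa_prev.shape[0]
    params = h_t @ W + b
    alpha = np.exp(params[:K])                  # mixture weights (importance)
    beta = np.exp(params[K:2 * K])              # widths (inverse variances)
    kappa = kappa_prev + np.exp(params[2 * K:]) # centers: strictly increase,
                                                # hence "forward only in time"
    u = np.arange(char_onehots.shape[0])[:, None]            # (U, 1) positions
    phi = np.sum(alpha * np.exp(-beta * (kappa - u) ** 2), axis=1)  # (U,)
    w_t = phi @ char_onehots                    # soft window over the text
    return w_t, kappa
```

Because the increment to κ is passed through `exp`, it is always positive, which is exactly the monotonicity constraint the Deep Learning Book quote refers to.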