====== Dependency Parsing ======

See also [[http://nlpprogress.com/english/dependency_parsing.html|NLP Progress - Dependency Parsing]]

===== Graph-Based Dependency Parsing =====

==== Early papers (pre-neural) ====

  * The paper that started graph-based dependency parsing: [[https://www.aclweb.org/anthology/H05-1066.pdf|McDonald et al 2005 - Non-projective Dependency Parsing using Spanning Tree Algorithms]]
  * [[https://www.aclweb.org/anthology/P10-1001.pdf|Koo & Collins 2010 - Efficient Third-order Dependency Parsers]]

==== Neural graph-based dependency parsing ====

  * [[https://aclanthology.org/P15-1031.pdf|Pei et al 2015 - An Effective Neural Network Model for Graph-based Dependency Parsing]]
  * [[https://www.aclweb.org/anthology/Q16-1023.pdf|Kiperwasser & Goldberg 2016 - Simple and Accurate Dependency Parsing Using Bidirectional LSTM Feature Representations]] An early neural graph-based parser, and one of the first papers to advocate using BiLSTM features in NLP models.
  * [[https://arxiv.org/pdf/1606.01280.pdf|Zhang et al 2016 - Dependency Parsing as Head Selection]] Applies MST and Eisner's algorithm to a neural network trained to predict the head of each word.
  * [[https://www.aclweb.org/anthology/P16-1218.pdf|Wang & Chang 2016 - Graph-based Dependency Parsing with Bidirectional LSTM]] Similar to and concurrent with (but slightly later than) Kiperwasser & Goldberg.
  * [[https://arxiv.org/pdf/1611.01734.pdf|Dozat & Manning 2017 - Deep Biaffine Attention for Neural Dependency Parsing]] Popular neural graph-based parser, often used as a baseline model.
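The head-selection view (Zhang et al 2016) reduces decoding to picking a head for each word from a matrix of arc scores. A minimal sketch of the simplest decoder, greedy per-word argmax, with a toy hand-written score matrix (the scores and sentence are made up for illustration; the papers above produce scores with a neural network and decode with MST or Eisner's algorithm, which, unlike greedy selection, guarantee a well-formed tree):

```python
import numpy as np

def greedy_head_selection(scores):
    """Pick the highest-scoring head for each word independently.

    scores: (n+1) x (n+1) matrix where scores[h, d] is the score of an
    arc from head h to dependent d; index 0 is the artificial ROOT.
    Returns heads, where heads[d] is the chosen head of word d.

    Caveat: independent argmax per word can produce cycles or multiple
    roots; MST (Chu-Liu/Edmonds) or Eisner decoding repairs this.
    """
    n = scores.shape[0] - 1
    heads = [0] * (n + 1)
    for d in range(1, n + 1):
        candidates = scores[:, d].copy()
        candidates[d] = -np.inf          # a word cannot be its own head
        heads[d] = int(np.argmax(candidates))
    return heads

# Toy scores for a 3-word sentence (rows = heads, cols = dependents).
scores = np.array([
    [-np.inf,     5.0,     1.0,     0.5],  # ROOT ->
    [-np.inf, -np.inf,     4.0,     1.0],  # w1 ->
    [-np.inf,     0.5, -np.inf,     3.0],  # w2 ->
    [-np.inf,     0.2,     0.1, -np.inf],  # w3 ->
])
print(greedy_head_selection(scores))  # → [0, 0, 1, 2]
```

Here every word happens to pick a head consistent with a single tree rooted at w1; in general the independent choices need the global MST decoding step used in the papers above.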
  * [[https://arxiv.org/pdf/1911.03875.pdf|Mrini et al 2019 - Rethinking Self-Attention: Towards Interpretability in Neural Parsing]]
  * [[https://arxiv.org/pdf/1807.01745.pdf|Gómez-Rodríguez et al 2018 - Global Transition-based Non-projective Dependency Parsing]]
  * **[[https://arxiv.org/pdf/2010.02550.pdf|Zmigrod et al 2020 - Please Mind the Root: Decoding Arborescences for Dependency Parsing]]**

=== Neural graph-based models with higher-order features ===

  * [[https://www.aclweb.org/anthology/P19-1237.pdf|Ji et al 2019 - Graph-based Dependency Parsing with Graph Neural Networks]]
  * [[https://arxiv.org/pdf/2005.00975.pdf|Zhang et al 2020 - Efficient Second-Order TreeCRF for Neural Dependency Parsing]]

===== Transition-Based Dependency Parsing =====

  * [[https://www.aclweb.org/anthology/D14-1082.pdf|Chen & Manning 2014 - A Fast and Accurate Dependency Parser using Neural Networks]] The first widely adopted neural dependency parser.
  * [[https://arxiv.org/pdf/1805.01087.pdf|Ma et al 2018 - Stack-Pointer Networks for Dependency Parsing]] Includes a good overview of the parsers existing at the time.
  * [[https://arxiv.org/pdf/1804.06004.pdf|Keith et al 2018 - Monte Carlo Syntax Marginals for Exploring and Using Dependency Parses]]

===== Unsupervised Dependency Parsing =====

  * **Overviews**
    * Concise summary of prior work in [[https://www.aclweb.org/anthology/2020.tacl-1.15.pdf|Nishida 2020]].
    * [[https://arxiv.org/pdf/2010.01535.pdf|Han et al 2020 - A Survey of Unsupervised Dependency Parsing]]
  * **Key papers**
    * [[https://www.aclweb.org/anthology/P04-1061.pdf|Klein & Manning 2004 - Corpus-Based Induction of Syntactic Structure: Models of Dependency and Constituency]] Introduces the Dependency Model with Valence (DMV). What made Dan Klein famous.
    * [[https://jscholarship.library.jhu.edu/bitstream/handle/1774.2/938/smith.2sp.thesis06.pdf?sequence=1&isAllowed=y|Noah Smith 2006 - Novel Estimation Methods for Unsupervised Discovery of Latent Structure in Natural Language Text]] What made Noah Smith famous.
    * [[https://www.aclweb.org/anthology/N09-1009.pdf|Cohen & Smith 2009 - Shared Logistic Normal Distributions for Soft Parameter Tying in Unsupervised Grammar Induction]] What made Shay Cohen famous.

===== Software =====

  * [[https://stanfordnlp.github.io/stanza/|Stanza]] A very good parser; generally more accurate than Stanford CoreNLP, NLTK, or other common toolkits.

===== People =====

  * [[https://scholar.google.com/citations?user=faXAgZQAAAAJ&hl=en|Zhenghua Li]]

===== Related Pages =====

  * [[Constituency Parsing]]
  * [[Semantic Dependencies]] (Semantic Dependency Parsing)
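As a companion to the transition-based parsers listed above, here is a minimal sketch of the classic arc-standard transition system that most of them build on. The sentence and gold transition sequence are a made-up toy example; real parsers (e.g. Chen & Manning 2014) replace the fixed oracle with a neural classifier that predicts each transition from the parser state:

```python
def arc_standard(words, transitions):
    """Execute an arc-standard derivation for a given transition sequence.

    words: list of tokens; index 0 is reserved for the artificial ROOT.
    transitions: sequence of "SH" (shift), "LA" (left-arc: second-top of
    the stack becomes a dependent of the top), or "RA" (right-arc: top
    becomes a dependent of the second-top).
    Returns heads, where heads[d] is the head index of word d.
    """
    heads = [None] * (len(words) + 1)
    stack = [0]                              # start with ROOT on the stack
    buffer = list(range(1, len(words) + 1))  # word indices left to read
    for t in transitions:
        if t == "SH":
            stack.append(buffer.pop(0))
        elif t == "LA":
            dep = stack.pop(-2)              # remove second-top
            heads[dep] = stack[-1]           # its head is the top
        elif t == "RA":
            dep = stack.pop()                # remove top
            heads[dep] = stack[-1]           # its head is the new top
    return heads

# Toy sentence "She ate fish": She <- ate -> fish, ate attached to ROOT.
print(arc_standard(["She", "ate", "fish"],
                   ["SH", "SH", "LA", "SH", "RA", "RA"]))
# → [None, 2, 0, 2]
```

An arc-standard derivation for an n-word sentence always has exactly 2n transitions (n shifts, n arc actions), which is what makes these parsers fast in practice.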