====== Hypergraphs ======

Hypergraphs are a generalization of graphs in which an edge (a //hyperedge//) can connect any number of nodes (one or more). A hypergraph can contain directed and undirected hyperedges. In a directed hyperedge, some of the nodes are marked as the head and the rest form the tail. Hypergraphs have been used extensively in NLP for parsing and machine translation with grammars.

===== General Hypergraph Introductions =====

* [[https://en.wikipedia.org/wiki/Hypergraph|Wikipedia - Hypergraphs]]
* [[http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.54.829&rep=rep1&type=pdf|Gallo et al 1990 - Directed Hypergraphs and Applications]]

===== Papers =====

Search the [[https://www.aclweb.org/anthology/|ACL Anthology]] for [[https://www.aclweb.org/anthology/search/?q=hypergraph|Hypergraphs]].

* [[https://www.aclweb.org/anthology/W01-1812.pdf|Klein & Manning 2001 - Parsing and Hypergraphs]]
* [[https://www.aclweb.org/anthology/C08-5001.pdf|Huang 2008 - Advanced Dynamic Programming in Semiring and Hypergraph Frameworks]]
* [[https://www.aclweb.org/anthology/N09-2003.pdf|Li & Khudanpur 2009 - Efficient Extraction of Oracle-best Translations from Hypergraphs]]
* [[https://www.aclweb.org/anthology/D15-1102.pdf|Lu & Roth 2015 - Joint Mention Extraction and Classification with Mention Hypergraphs]]
* [[https://www.aclweb.org/anthology/W16-3311.pdf|Bauer & Rambow 2016 - Hyperedge Replacement and Nonprojective Dependency Structures]]

===== Neural Papers =====

Neural papers that use hypergraphs.
* [[https://www.aclweb.org/anthology/D18-1019.pdf|Wang & Lu 2018 - Neural Segmental Hypergraphs for Overlapping Mention Recognition]] EMNLP 2018
* [[https://www.aclweb.org/anthology/N18-1079.pdf|Katiyar & Cardie 2018 - Nested Named Entity Recognition Revisited]]
* [[https://www.aclweb.org/anthology/2020.emnlp-main.399.pdf|Ding et al 2020 - Be More with Less: Hypergraph Attention Networks for Inductive Text Classification]] EMNLP 2020
* [[https://arxiv.org/pdf/2010.14439.pdf|Lin et al 2020 - Differentiable Open-Ended Commonsense Reasoning]] NAACL 2021
* From [[https://aclanthology.org/2020.findings-emnlp.133.pdf|this paper]]: "Hypergraph convolutional networks (Feng et al., 2019; Yadati et al., 2019) utilize hypergraph structure rather than a general graph to represent the high-order correlation among data entirely, and hypergraph attention (Bai et al., 2019) further enhances the ability of representation learning by using an attention module."
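The definition above (hyperedges connecting sets of nodes, with directed hyperedges split into a tail and a head) can be sketched as a minimal data structure. This is an illustrative sketch, not code from any of the papers listed; the names (''Hypergraph'', ''add_edge'') are made up for the example. The parsing connection: a CFG rule like S -> NP VP is naturally a directed hyperedge with tail {NP, VP} and head {S}.

```python
from collections import defaultdict

class Hypergraph:
    """Minimal directed hypergraph: each hyperedge links a set of
    tail nodes to a set of head nodes (illustrative sketch only)."""

    def __init__(self):
        self.nodes = set()
        self.edges = []                    # list of (tail, head) frozenset pairs
        self.incoming = defaultdict(list)  # node -> ids of edges with it in the head

    def add_edge(self, tail, head):
        tail, head = frozenset(tail), frozenset(head)
        self.nodes |= tail | head
        eid = len(self.edges)
        self.edges.append((tail, head))
        for v in head:
            self.incoming[v].append(eid)
        return eid

# Two CFG rules viewed as directed hyperedges:
#   S  -> NP VP   becomes tail {NP, VP}, head {S}
#   NP -> Det N   becomes tail {Det, N}, head {NP}
hg = Hypergraph()
hg.add_edge({"NP", "VP"}, {"S"})
hg.add_edge({"Det", "N"}, {"NP"})
print(sorted(hg.nodes))      # all symbols seen so far
print(hg.incoming["S"])      # hyperedges that can derive S
```

The ''incoming'' index is what dynamic-programming algorithms over hypergraphs (e.g. Huang 2008 above) traverse: to compute a value for a node, combine the values of each incoming hyperedge's tail nodes.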