Yao et al. (2019) - KG-BERT: BERT for Knowledge Graph Completion. The paper says: "most knowledge graph embedding models only use structure information in observed triple facts, which suffer from the sparseness of knowledge graphs. Some recent studies incorporate textual information to enrich knowledge representation (Socher et al. 2013; Xie et al. 2016; Xiao et al. 2017), but they learn unique text embedding for the same entity/relation in different triples, which ignore contextual information."
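To make the contrast concrete: KG-BERT's remedy is to pack a triple's entity and relation names into a single BERT input sequence and score it with a classification head, so the same entity gets a context-dependent representation in every triple it appears in. Below is a minimal sketch of that idea, assuming the HuggingFace transformers library; the head here is untrained (KG-BERT fine-tunes it on labeled triples), and the paper uses three input segments where this sketch approximates with BERT's native two-segment format, so the score is illustrative only.

```python
# Minimal sketch of the KG-BERT idea: score a triple (head, relation, tail)
# by encoding its surface text as one BERT sequence, so entity representations
# are contextual rather than a single fixed text embedding per entity.
# Assumes HuggingFace `transformers`; the classification head is untrained here.
import torch
from transformers import BertTokenizer, BertForSequenceClassification

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2  # label 1 = plausible triple
)
model.eval()

def triple_score(head: str, relation: str, tail: str) -> float:
    # Builds "[CLS] head [SEP] relation [SEP] tail [SEP]". BertTokenizer only
    # supports a text pair natively, so relation and tail are joined manually
    # with the [SEP] token (a simplification of the paper's three segments).
    text_b = f"{relation} {tokenizer.sep_token} {tail}"
    inputs = tokenizer(head, text_b, return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits
    # Probability that the triple is plausible, from the [CLS] representation.
    return torch.softmax(logits, dim=-1)[0, 1].item()

print(triple_score("Steve Jobs", "founded", "Apple Inc."))
```

Because the entity text runs through BERT's attention layers alongside the relation and the other entity, "Apple Inc." is represented differently in ("Steve Jobs", "founded", "Apple Inc.") than in some other triple, which is exactly the contextual information the quoted passage says earlier text-augmented embeddings ignore.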