Paper Title

A Retrieve-and-Read Framework for Knowledge Graph Link Prediction

Paper Authors

Vardaan Pahuja, Boshi Wang, Hugo Latapie, Jayanth Srinivasa, Yu Su

Paper Abstract

Knowledge graph (KG) link prediction aims to infer new facts based on existing facts in the KG. Recent studies have shown that using the graph neighborhood of a node via graph neural networks (GNNs) provides more useful information compared to just using the query information. Conventional GNNs for KG link prediction follow the standard message-passing paradigm on the entire KG, which leads to superfluous computation, over-smoothing of node representations, and also limits their expressive power. At a large scale, it becomes computationally expensive to aggregate useful information from the entire KG for inference. To address the limitations of existing KG link prediction frameworks, we propose a novel retrieve-and-read framework, which first retrieves a relevant subgraph context for the query and then jointly reasons over the context and the query with a high-capacity reader. As part of our exemplar instantiation for the new framework, we propose a novel Transformer-based GNN as the reader, which incorporates a graph-based attention structure and cross-attention between query and context for deep fusion. This simple yet effective design enables the model to focus on salient context information relevant to the query. Empirical results on two standard KG link prediction datasets demonstrate the competitive performance of the proposed method. Furthermore, our analysis yields valuable insights for designing improved retrievers within the framework.
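The two-stage pipeline the abstract describes is easy to picture in code. Below is a minimal, illustrative sketch in Python/PyTorch, assuming a simple BFS neighborhood retriever and a single Transformer reader layer; the names `retrieve_subgraph` and `ReaderLayer`, the hop and size limits, and the masking scheme are all hypothetical stand-ins, not the authors' implementation.

```python
# A minimal sketch of a retrieve-and-read pipeline for KG link prediction,
# assuming a BFS retriever and a Transformer reader with edge-masked
# self-attention plus query-context cross-attention. All names here
# (retrieve_subgraph, ReaderLayer) are hypothetical, not the paper's API.
from collections import deque

import torch
import torch.nn as nn


def retrieve_subgraph(adj, head, num_hops=2, max_nodes=64):
    """Toy retriever: entities within `num_hops` of the query head entity.

    adj maps entity id -> list of (relation id, tail entity id) edges.
    """
    visited, frontier = {head}, deque([(head, 0)])
    while frontier and len(visited) < max_nodes:
        node, depth = frontier.popleft()
        if depth == num_hops:
            continue
        for _, tail in adj.get(node, []):
            if tail not in visited:
                visited.add(tail)
                frontier.append((tail, depth + 1))
    return sorted(visited)


class ReaderLayer(nn.Module):
    """One reader layer: self-attention restricted to graph edges (a
    graph-based attention structure), then cross-attention that fuses
    the (head, relation) query into every context node."""

    def __init__(self, dim, heads=4):
        super().__init__()
        self.self_attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.cross_attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.norm1 = nn.LayerNorm(dim)
        self.norm2 = nn.LayerNorm(dim)

    def forward(self, ctx, query, edge_mask):
        # edge_mask[i, j] == True blocks attention from node i to node j,
        # so information flows only along edges (plus self-loops).
        h, _ = self.self_attn(ctx, ctx, ctx, attn_mask=edge_mask)
        ctx = self.norm1(ctx + h)
        h, _ = self.cross_attn(ctx, query, query)  # deep query fusion
        return self.norm2(ctx + h)


# Toy usage: 5 entities, 3 edges, random embeddings.
adj = {0: [(0, 1), (1, 2)], 1: [(0, 3)], 2: [(1, 4)]}
nodes = retrieve_subgraph(adj, head=0)
idx = {e: i for i, e in enumerate(nodes)}

dim, n = 32, len(nodes)
mask = torch.ones(n, n, dtype=torch.bool)  # True = attention blocked
mask.fill_diagonal_(False)                 # self-loops avoid all-masked rows
for u, edges in adj.items():
    for _, v in edges:
        mask[idx[u], idx[v]] = mask[idx[v], idx[u]] = False

ctx = torch.randn(1, n, dim)    # subgraph node embeddings
query = torch.randn(1, 1, dim)  # fused (head, relation) query embedding
out = ReaderLayer(dim)(ctx, query, mask)  # refined context, shape (1, n, dim)
```

A full model would stack several such layers, embed relations as well as entities, and score candidate tail entities from the refined context representations; the sketch only shows the structural idea of retrieval followed by edge-masked self-attention and query-context cross-attention.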
