Paper Title


Relation-Specific Attentions over Entity Mentions for Enhanced Document-Level Relation Extraction

Authors

Jiaxin Yu, Deqing Yang, Shuyu Tian

Abstract


Compared with traditional sentence-level relation extraction, document-level relation extraction is a more challenging task, where an entity in a document may be mentioned multiple times and associated with multiple relations. However, most methods of document-level relation extraction do not distinguish between mention-level features and entity-level features, and simply apply a pooling operation to aggregate mention-level features into entity-level features. As a result, the distinct semantics of an entity's different mentions are overlooked. To address this problem, we propose RSMAN, which performs selective attention over different entity mentions with respect to candidate relations. In this manner, flexible and relation-specific representations of entities are obtained, which indeed benefit relation classification. Our extensive experiments on two benchmark datasets show that RSMAN brings significant improvements to several backbone models, achieving state-of-the-art performance, especially when an entity has multiple mentions in the document.
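The core idea described above, replacing uniform pooling of mention features with attention weighted by a candidate relation, can be illustrated with a minimal sketch. This is not the paper's exact architecture; the function name, dimensions, and random features are illustrative assumptions, and real models would compute the scores with learned parameters.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def relation_specific_entity_rep(mention_feats, relation_emb):
    """Aggregate mention-level features into one entity representation,
    weighting each mention by its relevance to a candidate relation.

    mention_feats: (num_mentions, dim) array of mention features
    relation_emb:  (dim,) embedding of the candidate relation
    Returns a (dim,) relation-specific entity representation.
    """
    scores = mention_feats @ relation_emb   # relevance score per mention
    weights = softmax(scores)               # attention over mentions
    return weights @ mention_feats          # weighted average, shape (dim,)

# Toy example (hypothetical numbers): 3 mentions of one entity, 4-dim features.
rng = np.random.default_rng(0)
mentions = rng.normal(size=(3, 4))
rel = rng.normal(size=4)
entity_rep = relation_specific_entity_rep(mentions, rel)
print(entity_rep.shape)  # (4,)
```

With a different `rel`, the same mentions yield a different entity representation, which is the flexibility the abstract contrasts with a single pooled vector shared across all candidate relations.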
