Paper Title
Pretrained Transformers for Simple Question Answering over Knowledge Graphs
Paper Authors
Paper Abstract
Answering simple questions over knowledge graphs is a well-studied problem in question answering. Previous approaches to this task built on architectures based on recurrent and convolutional neural networks that use pretrained word embeddings. It was recently shown that finetuning pretrained transformer networks (e.g., BERT) can outperform previous approaches on various natural language processing tasks. In this work, we investigate how well BERT performs on SimpleQuestions and provide an evaluation of both BERT- and BiLSTM-based models in data-sparse scenarios.
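The abstract does not give implementation details, so the following is only a hedged sketch of one common way to finetune BERT for the relation-prediction component of simple question answering, written with the HuggingFace transformers library. The toy question/relation pairs, the `bert-base-uncased` checkpoint, and the hyperparameters are illustrative assumptions, not the paper's actual setup.

```python
# Minimal sketch (assumptions throughout): finetuning BERT for relation
# prediction, one subtask of simple QA over a knowledge graph. The model
# checkpoint, toy data, and hyperparameters are illustrative, not the
# paper's actual configuration.
import torch
from transformers import BertTokenizer, BertForSequenceClassification

# Toy question/relation pairs standing in for SimpleQuestions examples.
questions = ["who wrote the book gone girl",
             "where was barack obama born"]
relations = ["book/written_work/author",
             "people/person/place_of_birth"]
label2id = {r: i for i, r in enumerate(sorted(set(relations)))}

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=len(label2id))

enc = tokenizer(questions, padding=True, truncation=True, return_tensors="pt")
labels = torch.tensor([label2id[r] for r in relations])

optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
model.train()
for _ in range(3):  # a few epochs is typical when finetuning BERT
    optimizer.zero_grad()
    out = model(**enc, labels=labels)  # cross-entropy loss over relations
    out.loss.backward()                # updates all BERT layers + new head
    optimizer.step()
```

A full pipeline would additionally require entity detection and linking; at inference time the predicted relation is combined with the linked subject entity to retrieve the answer from the knowledge graph.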