Paper Title


Spoken Language Understanding for Conversational AI: Recent Advances and Future Direction

Authors

Soyeon Caren Han, Siqu Long, Henry Weld, Josiah Poon

Abstract


When a human communicates with a machine using natural language on the web and online, how can it understand the human's intention and semantic context of their talk? This is an important AI task as it enables the machine to construct a sensible answer or perform a useful action for the human. Meaning is represented at the sentence level, identification of which is known as intent detection, and at the word level, a labelling task called slot filling. This dual-level joint task requires innovative thinking about natural language and deep learning network design, and as a result, many approaches and models have been proposed and applied. This tutorial will discuss how the joint task is set up and introduce Spoken Language Understanding/Natural Language Understanding (SLU/NLU) with Deep Learning techniques. We will cover the datasets, experiments and metrics used in the field. We will describe how the machine uses the latest NLP and Deep Learning techniques to address the joint task, including recurrent and attention-based Transformer networks and pre-trained models (e.g. BERT). We will then look in detail at a network that allows the two levels of the task, intent classification and slot filling, to interact to boost performance explicitly. We will do a code demonstration of a Python notebook for this model and attendees will have an opportunity to watch coding demo tasks on this joint NLU to further their understanding.
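The abstract mentions the metrics used in the field. As a minimal, self-contained illustration (not code from the authors' notebook), the three metrics commonly reported for the joint task — intent accuracy, span-level slot F1 over BIO tags, and sentence-level semantic frame accuracy — can be sketched in plain Python. All function names and example labels below are hypothetical.

```python
# Sketch of the three standard joint-NLU metrics: intent accuracy,
# micro-averaged span-level slot F1, and semantic frame accuracy.
# Assumes well-formed BIO tag sequences (every span starts with B-).

def intent_accuracy(gold, pred):
    """Fraction of utterances whose intent label is predicted exactly."""
    return sum(g == p for g, p in zip(gold, pred)) / len(gold)

def bio_spans(tags):
    """Extract (label, start, end) spans from a BIO tag sequence."""
    spans, start, label = set(), None, None
    for i, tag in enumerate(tags + ["O"]):  # "O" sentinel flushes the last span
        if tag.startswith("B-") or tag == "O":
            if label is not None:
                spans.add((label, start, i))
            start, label = (i, tag[2:]) if tag.startswith("B-") else (None, None)
    return spans

def slot_f1(gold_seqs, pred_seqs):
    """Micro-averaged F1 over slot spans pooled across all utterances."""
    tp = fp = fn = 0
    for gold, pred in zip(gold_seqs, pred_seqs):
        g, p = bio_spans(gold), bio_spans(pred)
        tp += len(g & p)
        fp += len(p - g)
        fn += len(g - p)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return 2 * precision * recall / (precision + recall) if precision + recall else 0.0

def frame_accuracy(gold_intents, pred_intents, gold_slots, pred_slots):
    """Fraction of utterances where the intent AND every slot tag are correct."""
    correct = sum(gi == pi and gs == ps
                  for gi, pi, gs, ps in zip(gold_intents, pred_intents,
                                            gold_slots, pred_slots))
    return correct / len(gold_intents)

# Toy example: two utterances, one predicted perfectly, one not.
gold_i = ["book_flight", "get_weather"]
pred_i = ["book_flight", "play_music"]
gold_s = [["O", "B-city", "I-city"], ["B-date", "O"]]
pred_s = [["O", "B-city", "I-city"], ["O", "O"]]
print(intent_accuracy(gold_i, pred_i))          # → 0.5
print(round(slot_f1(gold_s, pred_s), 3))        # → 0.667
print(frame_accuracy(gold_i, pred_i, gold_s, pred_s))  # → 0.5
```

Frame accuracy is the strictest of the three, which is why joint models that let intent and slot predictions interact — the focus of the tutorial — are typically evaluated on it in addition to the two per-level metrics.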
