Title
Conditional Generation of Temporally-ordered Event Sequences
Authors
Abstract
Models of narrative schema knowledge have proven useful for a range of event-related tasks, but they typically do not capture the temporal relationships between events. We propose a single model that addresses both temporal ordering, sorting given events into the order they occurred, and event infilling, predicting new events which fit into an existing temporally-ordered sequence. We use a BART-based conditional generation model that can capture both temporality and common event co-occurrence, meaning it can be flexibly applied to different tasks in this space. Our model is trained as a denoising autoencoder: we take temporally-ordered event sequences, shuffle them, delete some events, and then attempt to recover the original event sequence. This task teaches the model to make inferences given incomplete knowledge about the events in an underlying scenario. On the temporal ordering task, we show that our model is able to unscramble event sequences from existing datasets without access to explicitly labeled temporal training data, outperforming both a BERT-based pairwise model and a BERT-based pointer network. On event infilling, human evaluation shows that our model is able to generate events that fit better temporally into the input events when compared to GPT-2 story completion models.
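The denoising setup described above (shuffle a temporally-ordered event sequence, delete some events, train the model to recover the original) can be sketched with a minimal noising function. This is an illustrative reconstruction, not the paper's code: the function name `noise_events`, the deletion probability, and the plain-string event representation are all assumptions for the sketch.

```python
import random

def noise_events(events, delete_prob=0.3, seed=None):
    """Build a noised input for denoising training: randomly delete some
    events, then shuffle the survivors. The training target is the
    original temporally-ordered sequence. (delete_prob is illustrative;
    the abstract does not specify the paper's exact noising rates.)"""
    rng = random.Random(seed)
    kept = [e for e in events if rng.random() > delete_prob]
    if not kept:
        # Ensure the input is never empty.
        kept = [rng.choice(events)]
    rng.shuffle(kept)
    return kept

# A (noised input, original sequence) pair forms one training example
# for the conditional generation model.
events = ["wake up", "eat breakfast", "commute", "work", "eat dinner"]
noised = noise_events(events, seed=0)
```

At inference time the same model can then be pointed at either task: feed it a scrambled sequence to order, or a partial ordered sequence to infill.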