Paper Title

UmBERTo-MTSA @ AcCompl-It: Improving Complexity and Acceptability Prediction with Multi-task Learning on Self-Supervised Annotations

Paper Authors

Sarti, Gabriele

Paper Abstract

This work describes a self-supervised data augmentation approach used to improve learning models' performances when only a moderate amount of labeled data is available. Multiple copies of the original model are initially trained on the downstream task. Their predictions are then used to annotate a large set of unlabeled examples. Finally, multi-task training is performed on the parallel annotations of the resulting training set, and final scores are obtained by averaging annotator-specific head predictions. Neural language models are fine-tuned using this procedure in the context of the AcCompl-it shared task at EVALITA 2020, obtaining considerable improvements in prediction quality.
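As a rough illustration of the procedure summarized above, the sketch below models the multi-task stage with a shared encoder and one regression head per pseudo-annotator, where the final score is the average of the head outputs. It assumes PyTorch and Hugging Face Transformers; the class and function names are hypothetical and are not taken from the paper's released code.

```python
# Minimal sketch of the multi-task fine-tuning setup described in the abstract.
# Assumes PyTorch + Hugging Face Transformers; model name, head count, and loss
# are illustrative choices, not details confirmed by the paper.
import torch
import torch.nn as nn
from transformers import AutoModel


class MultiHeadRegressor(nn.Module):
    """Shared encoder with one regression head per (pseudo-)annotator."""

    def __init__(self, model_name: str, num_heads: int):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(model_name)
        hidden = self.encoder.config.hidden_size
        # One scalar-output head per parallel annotation produced by a model copy.
        self.heads = nn.ModuleList(nn.Linear(hidden, 1) for _ in range(num_heads))

    def forward(self, input_ids, attention_mask):
        # Use the first-token representation as the sentence embedding.
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        cls = out.last_hidden_state[:, 0]
        # Shape: (batch_size, num_heads)
        return torch.cat([head(cls) for head in self.heads], dim=-1)


def multitask_loss(preds, targets):
    # `targets` holds one pseudo-label per head (the parallel annotations);
    # the multi-task objective here is a simple mean regression loss over heads.
    return nn.functional.mse_loss(preds, targets, reduction="mean")


@torch.no_grad()
def predict(model, input_ids, attention_mask):
    # Final score = average of the annotator-specific head predictions.
    return model(input_ids, attention_mask).mean(dim=-1)
```

Usage would follow the usual fine-tuning loop, e.g. `model = MultiHeadRegressor("Musixmatch/umberto-commoncrawl-cased-v1", num_heads=5)` if the UmBERTo checkpoint on the Hugging Face Hub is used; the number of heads matches the number of model copies that produced the pseudo-annotations.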
