Paper Title
mcBERT: Momentum Contrastive Learning with BERT for Zero-Shot Slot Filling
Paper Authors
Paper Abstract
Zero-shot slot filling has received considerable attention as a way to cope with the limited data available for a target domain. One of the important factors in zero-shot learning is making the model learn generalized and reliable representations. For this purpose, we present mcBERT, which stands for momentum contrastive learning with BERT, to develop a robust zero-shot slot filling model. mcBERT uses BERT to initialize two encoders, a query encoder and a key encoder, and is trained by applying momentum contrastive learning. Our experimental results on the SNIPS benchmark show that mcBERT substantially outperforms previous models, recording a new state-of-the-art. In addition, we show that each component of mcBERT contributes to the performance improvement.
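To make the training setup the abstract describes more concrete (two BERT-initialized encoders, the key encoder updated by momentum, and a contrastive objective), here is a minimal PyTorch sketch. It is an illustration only, not the paper's implementation: the encoder names, mean-pooling, momentum value, temperature, and the use of in-batch negatives are assumptions.

```python
# Minimal MoCo-style sketch: a query encoder trained by gradient descent and a
# key encoder updated as an exponential moving average (EMA) of the query encoder,
# with an InfoNCE-style contrastive loss. Hyperparameters are illustrative.
import copy
import torch
import torch.nn.functional as F
from transformers import BertModel

query_encoder = BertModel.from_pretrained("bert-base-uncased")
key_encoder = copy.deepcopy(query_encoder)      # both encoders start from the same BERT weights
for p in key_encoder.parameters():
    p.requires_grad = False                     # the key encoder is updated only by momentum

momentum = 0.999                                # illustrative value
temperature = 0.07                              # illustrative value


@torch.no_grad()
def momentum_update():
    """EMA update: key_params <- m * key_params + (1 - m) * query_params."""
    for q_p, k_p in zip(query_encoder.parameters(), key_encoder.parameters()):
        k_p.data.mul_(momentum).add_(q_p.data, alpha=1.0 - momentum)


def contrastive_loss(query_inputs, key_inputs):
    """InfoNCE loss between query and key representations (mean-pooled hidden states)."""
    q = query_encoder(**query_inputs).last_hidden_state.mean(dim=1)      # [B, H]
    with torch.no_grad():
        k = key_encoder(**key_inputs).last_hidden_state.mean(dim=1)      # [B, H]
    q, k = F.normalize(q, dim=-1), F.normalize(k, dim=-1)
    logits = q @ k.t() / temperature                                      # [B, B] similarities
    labels = torch.arange(q.size(0))                                      # positives on the diagonal
    return F.cross_entropy(logits, labels)
```

In this kind of setup, only the query encoder receives gradients; after each optimizer step, `momentum_update()` is called so the key encoder drifts slowly toward the query encoder, which keeps the key representations stable across training steps.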