Paper title
Contrastive Training for Improved Out-of-Distribution Detection
Paper authors
Paper abstract
Reliable detection of out-of-distribution (OOD) inputs is increasingly understood to be a precondition for deployment of machine learning systems. This paper proposes and investigates the use of contrastive training to boost OOD detection performance. Unlike leading methods for OOD detection, our approach does not require access to examples labeled explicitly as OOD, which can be difficult to collect in practice. We show in extensive experiments that contrastive training significantly helps OOD detection performance on a number of common benchmarks. By introducing and employing the Confusion Log Probability (CLP) score, which quantifies the difficulty of the OOD detection task by capturing the similarity of inlier and outlier datasets, we show that our method especially improves performance in the "near OOD" classes, a particularly challenging setting for previous methods.
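The abstract does not spell out the training objective, so as background, one common form of contrastive training is the SimCLR-style NT-Xent loss, in which two augmented views of the same input are pulled together and all other batch items are pushed apart. The sketch below is a minimal NumPy illustration of that standard loss, not necessarily the paper's exact objective; the function name `nt_xent_loss` and the temperature value are assumptions made here for illustration.

```python
import numpy as np

def nt_xent_loss(z, temperature=0.5):
    """Illustrative NT-Xent contrastive loss over a batch of 2N
    embeddings, where rows 2i and 2i+1 are two augmented views of
    the same input. A standard SimCLR-style objective; the paper's
    exact loss may differ."""
    z = z / np.linalg.norm(z, axis=1, keepdims=True)  # unit-normalize embeddings
    sim = z @ z.T / temperature                       # pairwise cosine similarities
    n = z.shape[0]
    np.fill_diagonal(sim, -np.inf)                    # exclude self-similarity
    pos = np.arange(n) ^ 1                            # row i's positive is its paired view
    # cross-entropy of each row's positive against all other batch items
    log_prob = sim[np.arange(n), pos] - np.log(np.exp(sim).sum(axis=1))
    return -log_prob.mean()
```

With perfectly aligned view pairs the loss is lower than for random embeddings, which is the behavior the training relies on: representations of the same input cluster, and OOD inputs then fall outside those clusters.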