Paper Title

Towards Better Selective Classification

Authors

Leo Feng, Mohamed Osama Ahmed, Hossein Hajimirsadeghi, Amir Abdi

Abstract

We tackle the problem of Selective Classification, where the objective is to achieve the best performance on a predetermined ratio (coverage) of the dataset. Recent state-of-the-art selective methods introduce architectural changes, either via a separate selection head or an extra abstention logit. In this paper, we challenge these methods. Our results suggest that the superior performance of state-of-the-art methods is owed to training a more generalizable classifier rather than to their proposed selection mechanisms. We argue that the best-performing selection mechanism should instead be rooted in the classifier itself. Our proposed selection strategy uses the classification scores and consistently achieves better results by a significant margin, across all coverages and all datasets, without any added compute cost. Furthermore, inspired by semi-supervised learning, we propose an entropy-based regularizer that improves the performance of selective classification methods. Our selection mechanism, combined with the entropy-based regularizer, achieves new state-of-the-art results.
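
As a concrete illustration of the score-based selection strategy the abstract describes, below is a minimal sketch. It assumes the selection score is simply the classifier's maximum softmax probability and that the acceptance threshold is the score quantile that yields the target coverage; the function name `select_by_score` and its NumPy interface are illustrative, not the paper's reference implementation.

```python
import numpy as np

def select_by_score(probs: np.ndarray, coverage: float):
    """Accept the `coverage` fraction of samples on which the classifier is
    most confident, scoring each sample by its maximum softmax probability.

    probs:    (N, C) array of softmax outputs from the classifier.
    coverage: fraction of samples to accept, e.g. 0.8.
    Returns a boolean acceptance mask and the score threshold used.
    """
    scores = probs.max(axis=1)                       # per-sample confidence
    threshold = np.quantile(scores, 1.0 - coverage)  # keep the top `coverage` fraction
    return scores >= threshold, threshold
```

Selective performance is then evaluated only on the accepted samples, e.g. `(probs.argmax(1)[mask] == labels[mask]).mean()` for selective accuracy at the chosen coverage.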
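
Likewise, a minimal sketch of an entropy-based regularizer in the spirit the abstract describes, assuming it is added to the standard cross-entropy objective with a weight `beta` (a hypothetical hyperparameter here; the exact form and weighting in the paper may differ):

```python
import torch
import torch.nn.functional as F

def entropy_regularized_loss(logits: torch.Tensor, targets: torch.Tensor,
                             beta: float = 0.1) -> torch.Tensor:
    """Cross-entropy plus a penalty on the mean predictive entropy.

    logits:  (N, C) raw classifier outputs.
    targets: (N,) integer class labels.
    beta:    regularizer weight (illustrative value; tune per dataset).
    """
    ce = F.cross_entropy(logits, targets)
    log_probs = F.log_softmax(logits, dim=1)
    entropy = -(log_probs.exp() * log_probs).sum(dim=1).mean()
    return ce + beta * entropy
```

Adding the predictive entropy to the loss pushes the classifier toward more confident, lower-entropy predictions, mirroring the entropy-minimization idea from semi-supervised learning that the abstract cites as inspiration.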
