Paper Title

Expansion and Shrinkage of Localization for Weakly-Supervised Semantic Segmentation

Paper Authors

Jinlong Li, Zequn Jie, Xu Wang, Xiaolin Wei, Lin Ma

Paper Abstract

Generating precise class-aware pseudo ground-truths, a.k.a. class activation maps (CAMs), is essential for weakly-supervised semantic segmentation. The original CAM method usually produces incomplete and inaccurate localization maps. To tackle this issue, this paper proposes an Expansion and Shrinkage scheme based on offset learning in deformable convolution, which sequentially improves the recall and precision of the located object in two respective stages. In the Expansion stage, an offset learning branch in a deformable convolution layer, referred to as the "expansion sampler", seeks to sample increasingly less discriminative object regions, driven by an inverse supervision signal that maximizes the image-level classification loss. The more complete object located in the Expansion stage is then gradually narrowed down to the final object region during the Shrinkage stage. In the Shrinkage stage, the offset learning branch of another deformable convolution layer, referred to as the "shrinkage sampler", is introduced to exclude the false-positive background regions attended to in the Expansion stage, improving the precision of the localization maps. We conduct extensive experiments on PASCAL VOC 2012 and MS COCO 2014 to demonstrate the superiority of our method over other state-of-the-art methods for weakly-supervised semantic segmentation. Code will be made publicly available at https://github.com/TyroneLi/ESOL_WSSS.
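
The core mechanism of the Expansion stage, an offset branch trained against the classifier via an inverted gradient, can be sketched as follows. This is a minimal illustration assuming PyTorch and torchvision.ops.deform_conv2d; the names ExpansionSampler and GradReverse are hypothetical, and the official repository's architecture and training details differ.

```python
import torch
import torch.nn as nn
from torchvision.ops import deform_conv2d


class GradReverse(torch.autograd.Function):
    """Identity in the forward pass; negates gradients in the backward
    pass, so whatever feeds this layer is trained to MAXIMIZE the loss
    the rest of the network minimizes (the "inverse supervision signal")."""

    @staticmethod
    def forward(ctx, x):
        return x

    @staticmethod
    def backward(ctx, grad_output):
        return -grad_output


class ExpansionSampler(nn.Module):
    """Deformable convolution whose offset branch receives reversed
    gradients: the main weights are trained normally, while the offsets
    learn to sample less discriminative object regions. Hypothetical
    sketch, not the paper's exact module."""

    def __init__(self, in_ch: int, out_ch: int, k: int = 3):
        super().__init__()
        self.k = k
        self.weight = nn.Parameter(torch.empty(out_ch, in_ch, k, k))
        nn.init.kaiming_normal_(self.weight)
        # Predicts a (dy, dx) offset for each of the k*k kernel taps.
        self.offset_branch = nn.Conv2d(in_ch, 2 * k * k, k, padding=k // 2)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        offsets = GradReverse.apply(self.offset_branch(x))
        return deform_conv2d(x, offsets, self.weight, padding=self.k // 2)


# Toy usage: feature map -> deformed features -> image-level logits.
feats = torch.randn(2, 64, 28, 28)
sampler = ExpansionSampler(64, 128)
logits = sampler(feats).mean(dim=(2, 3))  # stand-in for a classification head
```

By the abstract's description, the shrinkage sampler would reuse the same deformable-sampling idea but with standard (non-reversed) supervision, pulling the sampled locations back onto true object regions; see the official repository for the full two-stage training.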
