Title

Towards Accurate and Robust Domain Adaptation under Noisy Environments

Authors

Zhongyi Han, Xian-Jin Gui, Chaoran Cui, Yilong Yin

Abstract

In non-stationary environments, learning machines usually confront the domain adaptation scenario, in which the data distribution changes over time. Previous domain adaptation works have achieved great success in theory and practice. However, they always lose robustness in noisy environments, where the labels and features of source-domain examples are corrupted. In this paper, we report our attempt towards achieving accurate, noise-robust domain adaptation. We first give a theoretical analysis that reveals how harmful noises influence unsupervised domain adaptation. To eliminate the effect of label noise, we propose an offline curriculum learning scheme for minimizing a newly defined empirical source risk. To reduce the impact of feature noise, we propose a proxy-distribution-based margin discrepancy. We seamlessly combine these methods into an adversarial network that jointly optimizes them efficiently, successfully mitigating the negative influence of both data corruption and distribution shift. A series of empirical studies show that our algorithm remarkably outperforms the state of the art, with over 10% accuracy improvements on some domain adaptation tasks under noisy environments.
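The abstract only names the components, so as a rough illustration of the curriculum idea (training the source classifier on examples that are likely to be clean), here is a minimal PyTorch sketch. It uses a generic small-loss selection heuristic; the function names, the `keep_ratio` parameter, and the selection rule itself are assumptions for illustration, not the paper's exact definition of the empirical source risk.

```python
# Illustrative sketch only: rank source examples by per-sample loss and keep
# the small-loss fraction as "likely clean" when computing the source risk.
# This is a common heuristic against label noise, not the paper's exact method.
import torch
import torch.nn.functional as F

def select_small_loss(logits, labels, keep_ratio):
    """Return indices of the keep_ratio fraction of examples with the
    smallest cross-entropy loss; these are treated as likely clean."""
    with torch.no_grad():
        losses = F.cross_entropy(logits, labels, reduction="none")
        num_keep = max(1, int(keep_ratio * len(labels)))
        keep_idx = torch.argsort(losses)[:num_keep]
    return keep_idx

def source_classification_loss(model, x_src, y_src, keep_ratio):
    """Source risk computed only on the selected (likely clean) subset."""
    logits = model(x_src)
    keep_idx = select_small_loss(logits, y_src, keep_ratio)
    return F.cross_entropy(logits[keep_idx], y_src[keep_idx])
```

In practice, a curriculum of this kind would start with a permissive keep_ratio and tighten it over training rounds as the model's loss ranking becomes more reliable; the exact schedule used in the paper is not specified here.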
