Paper Title


Stability and Generalization of Differentially Private Minimax Problems

Authors

Yilin Kang, Yong Liu, Jian Li, Weiping Wang

Abstract


In the field of machine learning, many problems can be formulated as minimax problems, including reinforcement learning and generative adversarial networks, to name just a few. As a result, the minimax problem has attracted a great deal of attention from researchers in recent decades. However, there is relatively little work studying the privacy of the general minimax paradigm. In this paper, we focus on the privacy of the general minimax setting, combining differential privacy with the minimax optimization paradigm. Moreover, via algorithmic stability theory, we theoretically analyze the high-probability generalization performance of the differentially private minimax algorithm under the strongly-convex-strongly-concave condition. To the best of our knowledge, this is the first work to analyze the generalization performance of the general minimax paradigm while taking differential privacy into account.
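The abstract does not specify the algorithm, but one common way to combine differential privacy with minimax optimization is noisy gradient descent-ascent: clip each gradient to bound sensitivity, then add Gaussian noise before the update. The sketch below is a hypothetical illustration on a toy strongly-convex-strongly-concave objective, not the paper's method; all names and hyperparameters here are assumptions.

```python
import numpy as np

def dp_gda(grad_x, grad_y, x0, y0, steps=500, eta=0.05,
           clip=1.0, noise_std=0.1, rng=None):
    """Hypothetical sketch of differentially private gradient
    descent-ascent: gradients are clipped (bounding per-step
    sensitivity) and perturbed with Gaussian noise, then the min
    variable descends while the max variable ascends."""
    rng = np.random.default_rng(0) if rng is None else rng
    x, y = x0, y0
    for _ in range(steps):
        gx = grad_x(x, y)
        gy = grad_y(x, y)
        # Clip to norm at most `clip`, then privatize with noise.
        gx = gx / max(1.0, abs(gx) / clip) + noise_std * rng.standard_normal()
        gy = gy / max(1.0, abs(gy) / clip) + noise_std * rng.standard_normal()
        x -= eta * gx  # descend in the min variable
        y += eta * gy  # ascend in the max variable
    return x, y

# Toy strongly-convex-strongly-concave objective:
#   f(x, y) = 0.5*x**2 + x*y - 0.5*y**2, with saddle point at (0, 0).
x, y = dp_gda(lambda x, y: x + y, lambda x, y: x - y, x0=2.0, y0=-2.0)
```

In this sketch the noise scale is a free parameter; in a real differentially private algorithm it would be calibrated to the clipping bound, the number of iterations, and the target privacy budget.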
