Paper Title
Privacy-Preserving Convex Optimization: When Differential Privacy Meets Stochastic Programming
Paper Authors
Paper Abstract
Convex optimization finds many real-life applications in which the results, when computed on real data, may expose private data attributes (e.g., individual health records, commercial information), thus leading to privacy breaches. To avoid these breaches and formally guarantee privacy to optimization data owners, we develop a new privacy-preserving perturbation strategy for convex optimization programs by combining stochastic (chance-constrained) programming and differential privacy. Unlike standard noise-additive strategies, which perturb either optimization data or optimization results, we express the optimization variables as functions of the random perturbation using linear decision rules; we then optimize these rules to accommodate the perturbation within the problem's feasible region by enforcing chance constraints. This way, the perturbation remains feasible and renders optimization datasets that are different, yet adjacent in the sense of a given distance function, statistically similar in their randomized optimization results, thereby enabling probabilistic differential privacy guarantees. The chance-constrained optimization additionally internalizes the conditional value-at-risk measure to model the tolerance towards the worst-case realizations of the optimality loss with respect to the non-private solution. We demonstrate the privacy properties of our perturbation strategy analytically and through optimization and machine learning applications.
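To make the mechanism concrete, here is a minimal sketch under stated assumptions: a toy linear program, a Gaussian perturbation, and a sample-based (scenario) approximation of the chance constraints, written with the cvxpy modeling library. The cost data, the noise scale sigma, and the diag(F) >= 1 lower bound are all illustrative assumptions rather than the paper's calibration.

```python
# A minimal, self-contained sketch (not the paper's implementation) of the
# abstract's mechanism on a toy LP: min c^T x s.t. A x <= b, x >= 0.
# A linear decision rule x(xi) = x0 + F @ xi maps a Gaussian perturbation
# xi into the decision space, and the chance constraint
# Pr[A x(xi) <= b] >= 1 - eps is approximated by enforcing feasibility on
# K sampled scenarios. The bound diag(F) >= 1 is a hypothetical stand-in
# for the paper's differential-privacy noise calibration.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
n, m, K = 3, 4, 200                 # variables, constraints, scenarios
c = np.array([1.0, 2.0, 0.5])       # toy cost vector (assumed)
A = rng.normal(size=(m, n))         # toy constraint data (assumed)
b = rng.uniform(1.0, 2.0, size=m)
sigma = 0.1                         # noise scale; in the paper it would be
                                    # calibrated to the dataset sensitivity
xi_samples = rng.normal(scale=sigma, size=(K, n))

x0 = cp.Variable(n)                 # nominal (pre-perturbation) solution
F = cp.Variable((n, n))             # linear decision rule: x(xi) = x0 + F xi

constraints = [x0 >= 0, cp.diag(F) >= 1.0]
# Scenario approximation of the chance constraint: every sampled
# perturbed solution must remain inside the feasible region.
constraints += [A @ (x0 + F @ xi) <= b for xi in xi_samples]

# Since E[xi] = 0, the expected cost of the randomized solution
# reduces to the nominal cost c^T x0.
cp.Problem(cp.Minimize(c @ x0), constraints).solve()

# Release one randomized, feasible solution.
xi = rng.normal(scale=sigma, size=n)
x_private = x0.value + F.value @ xi
print("released randomized solution:", x_private)
```

In the paper's full model, the noise distribution, the decision rule, and the chance-constraint level jointly determine the differential privacy guarantee; the scenario approximation above merely stands in for the exact chance constraints.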
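The abstract also mentions internalizing conditional value-at-risk (CVaR) to bound the worst-case optimality loss. A standard way to express CVaR in a convex program is the Rockafellar-Uryasev formulation; the sketch below evaluates it on assumed loss samples (the loss values, the confidence level alpha, and the framing as an objective rather than a constraint are hypothetical).

```python
# A sketch of the CVaR term via the Rockafellar-Uryasev sample formulation:
#   CVaR_alpha(L) = min_t  t + E[(L - t)_+] / (1 - alpha).
# In the paper's program this quantity would plausibly appear as a
# constraint, e.g. CVaR_alpha(optimality loss) <= tolerance, rather than
# as a stand-alone objective. All numbers below are illustrative.
import numpy as np
import cvxpy as cp

alpha = 0.9                                   # confidence level (assumed)
rng = np.random.default_rng(1)
# Assumed samples of the optimality loss of the randomized solution
# relative to the non-private optimum, e.g. c^T x(xi_k) - c^T x*.
losses = rng.normal(loc=0.2, scale=0.05, size=500)

t = cp.Variable()                             # auxiliary VaR-level variable
cvar = t + cp.sum(cp.pos(losses - t)) / ((1 - alpha) * losses.size)
cp.Problem(cp.Minimize(cvar)).solve()
print("estimated CVaR of the optimality loss:", cvar.value)
```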