Title

Revisiting Hyperparameter Tuning with Differential Privacy

Authors

Youlong Ding, Xueyang Wu

Abstract

Hyperparameter tuning is a common practice in the application of machine learning but is a typically ignored aspect in the literature on privacy-preserving machine learning due to its negative effect on the overall privacy parameter. In this paper, we aim to tackle this fundamental yet challenging problem by providing an effective hyperparameter tuning framework with differential privacy. The proposed method allows us to adopt a broader hyperparameter search space and even to perform a grid search over the whole space, since its privacy loss parameter is independent of the number of hyperparameter candidates. Interestingly, it instead correlates with the utility gained from hyperparameter search, revealing an explicit and mandatory trade-off between privacy and utility. Theoretically, we show that the additional privacy loss incurred by hyperparameter tuning is upper-bounded by the square root of the gained utility. However, we note that the additional privacy loss empirically scales like the square root of the logarithm of the utility term, benefiting from the design of the doubling step.
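The abstract's key quantitative point is that the extra privacy cost of tuning tracks the achieved utility rather than the number of candidates. The sketch below is illustrative only and is not the authors' algorithm: the names noisy_score, doubling_search, eps_per_round, init_target, and the learning-rate grid are all hypothetical, the per-round selection simply adds Laplace noise to fabricated scores, and privacy is tallied with loose per-round accounting rather than the paper's analysis. Its purpose is to show how a doubling utility target keeps the number of private selection rounds logarithmic in the final utility target.

```python
# Illustrative sketch only -- NOT the algorithm from the paper. Names such as
# noisy_score, doubling_search and eps_per_round are hypothetical. The point is
# the control flow: a doubling utility target bounds the number of private
# selection rounds by log2(max_target / init_target), so the privacy cost spent
# on tuning does not grow with the number of hyperparameter candidates.
import math
import random


def noisy_score(hparams, epsilon, sensitivity=1.0):
    """Stand-in for one privately released validation utility for `hparams`.

    A real system would train with DP (e.g. DP-SGD) and release the score with
    calibrated noise; here we fabricate a utility and add Laplace noise.
    """
    true_utility = 1.0 / (1.0 + abs(math.log10(hparams["lr"]) + 2.0))
    laplace = random.expovariate(1.0) - random.expovariate(1.0)  # Laplace(0, 1)
    return true_utility + (sensitivity / epsilon) * laplace


def doubling_search(candidates, eps_per_round=0.2, init_target=0.125, max_target=1.0):
    """Pick the noisy best candidate each round; stop once the current utility
    target is met, otherwise double the target (the "doubling step")."""
    target, eps_spent, best = init_target, 0.0, None
    while target <= max_target:
        eps_spent += eps_per_round  # loose, naive per-round accounting
        scored = [(noisy_score(h, eps_per_round), h) for h in candidates]
        best = max(scored, key=lambda pair: pair[0])
        if best[0] >= target:
            break
        target *= 2.0  # number of rounds ~ log2(final target / init_target)
    return best, eps_spent


grid = [{"lr": lr} for lr in (1e-4, 3e-4, 1e-3, 3e-3, 1e-2)]
(score, chosen), eps_spent = doubling_search(grid)
print(f"chose {chosen} with noisy utility {score:.3f}; tuning spent ~{eps_spent:.2f} epsilon")
```

Because the number of rounds grows like the logarithm of the final utility target, naive composition over rounds already gives a tuning cost that is logarithmic in the utility; standard advanced composition would tighten this toward the square root of that logarithm, which matches the empirical scaling described in the abstract.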
