Title :
Privacy-Preserving Gradient-Descent Methods
Author :
Han, Shuguo ; Ng, Wee Keong ; Wan, Li ; Lee, Vincent C S
Author_Institution :
Centre for Adv. Inf. Syst. (CAIS), Nanyang Technol. Univ., Singapore, Singapore
fDate :
6/1/2010 12:00:00 AM
Abstract :
Gradient descent is a widely used paradigm for solving many optimization problems. It iteratively minimizes a target function in order to reach a local minimum. In machine learning or data mining, this function corresponds to a decision model that is to be discovered. In this paper, we propose a preliminary formulation of gradient descent with data privacy preservation. We present two approaches, a stochastic approach and a least-square approach, under different assumptions. Four protocols are proposed for the two approaches, incorporating various secure building blocks for both horizontally and vertically partitioned data. We conduct experiments to evaluate the scalability of the proposed secure building blocks and the accuracy and efficiency of the protocols in four different scenarios. The experimental results show that the proposed secure building blocks are reasonably scalable and that the proposed protocols allow us to identify the more suitable secure protocol for each application scenario.
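To illustrate the underlying paradigm the abstract refers to (not the paper's privacy-preserving protocols), the following is a minimal sketch of plain gradient descent applied to a least-squares objective, using the standard update w ← w − η∇f(w). The function name, learning rate, and toy data are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def gradient_descent(X, y, lr=0.01, epochs=1000):
    """Plain (non-private) gradient descent for a least-squares model y ~ X @ w.

    This is an illustrative sketch of the basic paradigm only; the paper's
    contribution is performing such updates over partitioned data with
    secure building blocks, which is not shown here.
    """
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        grad = (2.0 / n) * X.T @ (X @ w - y)  # gradient of the mean squared error
        w -= lr * grad                        # step along the negative gradient
    return w

# Toy usage: recover the weights of a synthetic linear model.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
true_w = np.array([1.5, -2.0, 0.5])
y = X @ true_w + 0.01 * rng.normal(size=100)
print(gradient_descent(X, y))  # should be close to [1.5, -2.0, 0.5]
```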
Keywords :
data mining; data privacy; gradient methods; learning (artificial intelligence); optimisation; data privacy preservation; decision model; least square approach; machine learning; optimization problems; privacy-preserving gradient-descent methods; stochastic approach; privacy-preserving data mining; gradient-descent method; secure multiparty computation
Journal_Title :
Knowledge and Data Engineering, IEEE Transactions on
DOI :
10.1109/TKDE.2009.153