DocumentCode :
3252827
Title :
Stochastic gradient descent with differentially private updates
Author :
Song, Shuang ; Chaudhuri, Kamalika ; Sarwate, Anand D.
Author_Institution :
Dept. of Comput. Sci. & Eng., Univ. of California, San Diego, La Jolla, CA, USA
fYear :
2013
fDate :
3-5 Dec. 2013
Firstpage :
245
Lastpage :
248
Abstract :
Differential privacy is a recent framework for computation on sensitive data, which has shown considerable promise in the regime of large datasets. Stochastic gradient methods are a popular approach for learning in the data-rich regime because they are computationally tractable and scalable. In this paper, we derive differentially private versions of stochastic gradient descent, and test them empirically. Our results show that standard SGD experiences high variability due to differential privacy, but a moderate increase in the batch size can improve performance significantly.
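Example (illustrative) :
The sketch below illustrates the idea described in the abstract: mini-batch SGD in which each gradient step is perturbed with noise before the update, so that larger batches dilute the noise. It is a minimal sketch only; the Gaussian noise, gradient clipping bound, logistic-regression loss, and all parameter names are assumptions for illustration and do not reproduce the paper's exact mechanism or sensitivity analysis.

```python
# Minimal sketch of mini-batch SGD with noisy (differentially private style) updates.
# Assumptions (not from the record): Gaussian noise, a per-example gradient norm
# bound (clipping), logistic-regression loss; parameter names are illustrative.
import numpy as np

def dp_sgd(X, y, epochs=5, batch_size=50, lr=0.1, clip=1.0, noise_scale=1.0, seed=0):
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        order = rng.permutation(n)
        for start in range(0, n, batch_size):
            idx = order[start:start + batch_size]
            # Per-example logistic-loss gradients, clipped to bound their norm.
            margins = y[idx] * (X[idx] @ w)
            grads = -(y[idx] / (1.0 + np.exp(margins)))[:, None] * X[idx]
            norms = np.linalg.norm(grads, axis=1, keepdims=True)
            grads = grads / np.maximum(1.0, norms / clip)
            # Average the clipped gradients and add noise before the update;
            # a larger batch dilutes the noise, consistent with the abstract's
            # observation that a moderate batch-size increase helps.
            noisy_grad = grads.mean(axis=0) + rng.normal(
                scale=noise_scale * clip / len(idx), size=d)
            w -= lr * noisy_grad
    return w

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.normal(size=(1000, 5))
    true_w = rng.normal(size=5)
    y = np.sign(X @ true_w + 0.1 * rng.normal(size=1000))
    print(dp_sgd(X, y))
```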
Keywords :
data privacy; gradient methods; stochastic processes; SGD; batch size; differential privacy; learning; scalable data; stochastic gradient descent method; tractable data; Algorithm design and analysis; Data privacy; Linear programming; Logistics; Noise; Privacy; Signal processing algorithms;
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
2013 IEEE Global Conference on Signal and Information Processing (GlobalSIP)
Conference_Location :
Austin, TX, USA
Type :
conf
DOI :
10.1109/GlobalSIP.2013.6736861
Filename :
6736861