DocumentCode
6787
Title
Incentive Compatible Privacy-Preserving Data Analysis
Author
Kantarcioglu, Murat; Jiang, Wei
Author_Institution
Jonsson Sch. of Eng. & Comput. Sci., Univ. of Texas at Dallas, Richardson, TX, USA
Volume
25
Issue
6
fYear
2013
fDate
Jun-13
Firstpage
1323
Lastpage
1335
Abstract
In many cases, competing parties who have private data may collaboratively conduct privacy-preserving distributed data analysis (PPDA) tasks to learn beneficial data models or analysis results. Most often, the competing parties have different incentives. Although certain PPDA techniques guarantee that nothing other than the final analysis result is revealed, it is impossible to verify whether participating parties are truthful about their private input data. Unless proper incentives are set, current PPDA techniques cannot prevent participating parties from modifying their private inputs. This raises the question of how to design incentive compatible privacy-preserving data analysis techniques that motivate participating parties to provide truthful inputs. In this paper, we first develop key theorems, and then, based on these theorems, we analyze certain important privacy-preserving data analysis tasks that could be conducted in a way that telling the truth is the best choice for any participating party.
Keywords
data analysis; data models; data privacy; distributed processing; learning (artificial intelligence); PPDA techniques; data model; incentive compatible privacy-preserving data analysis techniques; learning; privacy-preserving distributed data analysis; analytical models; companies; computational modeling; data mining; protocols; privacy; noncooperative computation; secure multiparty computation
fLanguage
English
Journal_Title
Knowledge and Data Engineering, IEEE Transactions on
Publisher
ieee
ISSN
1041-4347
Type
jour
DOI
10.1109/TKDE.2012.61
Filename
6171190
Link To Document