Abstract
Over the past two decades, digital information collected by corporations, organizations, and governments has produced a huge number of datasets, and the speed of data collection has increased dramatically in the last few years. However, most of these datasets relate to individuals and contain private or sensitive information. Differential privacy is a rigorous privacy model that provides a provable privacy guarantee for individuals: it ensures that an adversary has only a low probability of inferring any unknown record. Compared with earlier privacy models, differential privacy resists background-knowledge attacks while still offering a provable privacy guarantee.
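The guarantee described above is most commonly achieved with the Laplace mechanism: calibrated noise, scaled to the query's sensitivity divided by the privacy budget ε, is added to the true answer. The sketch below is illustrative only (the function name and parameters are not from the chapter), assuming a simple counting query with sensitivity 1:

```python
import math
import random

def laplace_mechanism(true_answer, sensitivity, epsilon, rng):
    """Add Laplace(0, sensitivity/epsilon) noise to a query answer.

    Inverse-transform sampling: if U ~ Uniform(-1/2, 1/2), then
    -b * sign(U) * ln(1 - 2|U|) is distributed as Laplace(0, b).
    """
    scale = sensitivity / epsilon
    u = rng.random() - 0.5
    noise = -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))
    return true_answer + noise

# A counting query ("how many records satisfy P?") has sensitivity 1:
# adding or removing one individual's record changes the count by at most 1.
rng = random.Random(0)  # fixed seed for reproducibility of this sketch
noisy_count = laplace_mechanism(1000, sensitivity=1.0, epsilon=0.5, rng=rng)
```

A smaller ε means a larger noise scale and therefore stronger privacy at the cost of accuracy, which is the central trade-off the book examines.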
Copyright information
© 2017 Springer International Publishing AG
Zhu, T., Li, G., Zhou, W., Yu, P.S. (2017). Introduction. In: Differential Privacy and Applications. Advances in Information Security, vol 69. Springer, Cham. https://doi.org/10.1007/978-3-319-62004-6_1
Print ISBN: 978-3-319-62002-2
Online ISBN: 978-3-319-62004-6