Introduction

Part of the book series: Advances in Information Security (ADIS, volume 69)

Abstract

Over the past two decades, the digital information collected by corporations, organizations and governments has produced a huge number of datasets, and the speed of such data collection has increased dramatically over the last few years. However, most of these datasets relate to individuals and contain private or sensitive information. Differential privacy is a rigorous privacy model that provides a provable privacy guarantee for individuals: it bounds the probability that an adversary can infer the presence or content of any single record. Compared with earlier privacy models, differential privacy resists background-knowledge attacks and offers a provable privacy guarantee.
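
For readers unfamiliar with the model, the guarantee referred to above is usually stated as follows (this is the standard formulation from the differential privacy literature, not a quotation from this chapter): a randomized mechanism \(\mathcal{M}\) satisfies \(\varepsilon\)-differential privacy if, for every pair of neighbouring datasets \(D\) and \(D'\) that differ in a single record and for every set of possible outputs \(S\),

\[ \Pr[\mathcal{M}(D) \in S] \;\le\; e^{\varepsilon} \, \Pr[\mathcal{M}(D') \in S]. \]

A small privacy budget \(\varepsilon\) means the output distributions on \(D\) and \(D'\) are nearly indistinguishable, which is what limits an adversary's ability to determine whether any individual's record is in the dataset, regardless of the adversary's background knowledge.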




Copyright information

© 2017 Springer International Publishing AG

About this chapter

Cite this chapter

Zhu, T., Li, G., Zhou, W., Yu, P.S. (2017). Introduction. In: Differential Privacy and Applications. Advances in Information Security, vol 69. Springer, Cham. https://doi.org/10.1007/978-3-319-62004-6_1

  • DOI: https://doi.org/10.1007/978-3-319-62004-6_1

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-62002-2

  • Online ISBN: 978-3-319-62004-6

  • eBook Packages: Computer Science, Computer Science (R0)
