Perturbation Paradigms of Maintaining Privacy-Preserving Monotonicity for Differential Privacy

  • Hai Liu
  • Zhenqiang Wu
  • Changgen Peng
  • Shuangyue Zhang
  • Feng Tian
  • Laifeng Lu
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10631)

Abstract

Differential privacy provides distinct mechanisms for protecting confidential information in numeric and character data. However, because the data types differ, existing work lacks a uniform criterion for evaluating these differential privacy mechanisms. In this paper, we propose the privacy-preserving monotonicity principle as an evaluation criterion for differential privacy mechanisms. First, we summarize three perturbation paradigms found in existing work: linear perturbation, non-linear perturbation, and randomized perturbation. Second, we formulate the privacy-preserving monotonicity principle of differential privacy, based on computational indistinguishability, for numeric and character data respectively. Finally, by analyzing the privacy-preserving monotonicity of existing perturbation methods within each paradigm, we present constrained perturbation paradigms for numeric and character data that achieve privacy-preserving monotonicity. The privacy-preserving monotonicity principle thus captures the trade-off between privacy and utility and can serve as an evaluation criterion for differential privacy mechanisms. Furthermore, we show that constrained perturbation paradigms maintaining privacy-preserving monotonicity provide a useful guideline for the development of differential privacy.
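For orientation, the sketch below illustrates two textbook mechanisms that fall under the paradigms named above: the Laplace mechanism, an additive (linear) perturbation of a numeric query, and Warner's randomized response, a randomized perturbation of a binary (character-type) answer. This is a minimal sketch of the standard mechanisms only, not the paper's constrained perturbation paradigms; the function names and parameters are illustrative assumptions.

```python
import math
import random


def laplace_mechanism(true_value: float, sensitivity: float, epsilon: float) -> float:
    """Linear (additive) perturbation of a numeric query with Laplace noise.

    Noise scale sensitivity/epsilon is the standard calibration for
    epsilon-differential privacy.
    """
    scale = sensitivity / epsilon
    # The difference of two i.i.d. exponentials with rate 1/scale is Laplace(0, scale).
    noise = random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)
    return true_value + noise


def randomized_response(true_answer: bool, epsilon: float) -> bool:
    """Randomized perturbation of a binary (character-type) answer.

    Report the truth with probability e^eps / (e^eps + 1), otherwise flip it
    (Warner's randomized response), which satisfies epsilon-differential privacy.
    """
    p_truth = math.exp(epsilon) / (math.exp(epsilon) + 1.0)
    return true_answer if random.random() < p_truth else not true_answer


if __name__ == "__main__":
    # Example: a counting query with sensitivity 1, and a single yes/no answer.
    print(laplace_mechanism(true_value=42.0, sensitivity=1.0, epsilon=0.5))
    print(randomized_response(True, epsilon=0.5))
```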

Keywords

Computational indistinguishability · Differential privacy · Perturbation paradigms · Privacy metrics · Privacy-preserving monotonicity

Notes

Acknowledgments

This work was supported by the National Natural Science Foundation of China under Grants No. 61173190, No. 61602290, and No. 61662009, the Natural Science Basic Research Program of Shaanxi Province under Grant No. 2017JQ6038, the Fundamental Research Funds for the Central Universities under Grants No. GK201704016, No. 2016CBY004, No. GK201603093, No. GK201501008, and No. GK201402004, the Program of Key Science and Technology Innovation Team in Shaanxi Province under Grant No. 2014KTC-18, and the Open Project Fund of the Guizhou Provincial Key Laboratory of Public Big Data under Grant No. 2017BDKFJJ026.


Copyright information

© Springer International Publishing AG, part of Springer Nature 2018

Authors and Affiliations

  • Hai Liu (1)
  • Zhenqiang Wu (1)
  • Changgen Peng (2)
  • Shuangyue Zhang (1)
  • Feng Tian (1)
  • Laifeng Lu (3)
  1. School of Computer Science, Shaanxi Normal University, Xi'an, China
  2. Guizhou Provincial Key Laboratory of Public Big Data, Guizhou University, Guiyang, China
  3. School of Mathematics and Information Science, Shaanxi Normal University, Xi'an, China
