Core-concavity, Gain Functions and Axioms for Information Leakage

  • Arthur Américo
  • M. H. R. Khouzani
  • Pasquale Malacaria
Chapter
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11760)

Abstract

This work explores connections between core-concavity and gain functions, two alternative approaches that emerged in the quantitative information flow community to provide a general framework for studying information leakage. In particular, (1) we revisit “Axioms for Information Leakage” by replacing averaging with \(\eta \)-averaging and convexity with core-concavity. An interesting consequence of these changes is that the revised axioms capture all Rényi entropies, including those not captured by the original formulation of the axioms. (2) We provide an alternative proof of the Coriaceous Theorem based on core-concavity. The general approach of this work is more information-theoretic in nature than the work based on gain functions, and it provides an alternative foundational view of quantitative information flow, rooted in the essential properties of entropy as a measure of uncertainty.
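As an illustration of the quantities the abstract refers to (this sketch is not code from the chapter), the Rényi entropy of order \(\alpha \) of a discrete distribution \(p\) has the standard closed form \(H_\alpha (p) = \frac{1}{1-\alpha }\log \sum _i p_i^\alpha \), recovering Shannon entropy as \(\alpha \rightarrow 1\) and min-entropy as \(\alpha \rightarrow \infty \):

```python
import math

def renyi_entropy(p, alpha):
    """Rényi entropy H_alpha of a discrete distribution p, in bits.

    alpha = 1 is handled as the limiting case (Shannon entropy);
    large alpha approaches min-entropy, the measure underlying
    Smith's foundational leakage analysis [16].
    """
    if alpha == 1:
        return -sum(x * math.log2(x) for x in p if x > 0)
    return math.log2(sum(x ** alpha for x in p)) / (1 - alpha)

# A uniform distribution on 4 outcomes has 2 bits of entropy
# for every order alpha.
print(renyi_entropy([0.25, 0.25, 0.25, 0.25], 2))  # 2.0
```

All orders agree on the uniform distribution, while skewed distributions have strictly lower entropy; it is this whole family that the \(\eta \)-averaged, core-concave axioms are able to capture.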

References

  1. Alvim, M.S., Chatzikokolakis, K., McIver, A., Morgan, C., Palamidessi, C., Smith, G.: Axioms for information leakage. In: Proceedings of CSF, pp. 77–92 (2016). https://doi.org/10.1109/CSF.2016.13
  2. Alvim, M.S., Chatzikokolakis, K., Palamidessi, C., Smith, G.: Measuring information leakage using generalized gain functions. In: 2012 IEEE 25th Computer Security Foundations Symposium, pp. 265–279 (2012). https://doi.org/10.1109/CSF.2012.26
  3. Arimoto, S.: Information measures and capacity of order \(\alpha \) for discrete memoryless channels. Top. Inf. Theory (1977)
  4. Boyd, S., Vandenberghe, L.: Convex Optimization. Cambridge University Press, Cambridge (2004)
  5.
  6. Havrda, J., Charvát, F.: Quantification method of classification processes: concept of structural \(\alpha \)-entropy. Kybernetika 3(1), 30–35 (1967)
  7. Hayashi, M.: Exponential decreasing rate of leaked information in universal random privacy amplification. IEEE Trans. Inf. Theory 57(6), 3989–4001 (2011)
  8. Iwamoto, M., Shikata, J.: Information theoretic security for encryption based on conditional Rényi entropies. In: Padró, C. (ed.) Information Theoretic Security, pp. 103–121. Springer, Cham (2014)
  9. Khouzani, M.H.R., Malacaria, P.: Relative perfect secrecy: universally optimal strategies and channel design. In: IEEE 29th Computer Security Foundations Symposium, CSF 2016, Lisbon, Portugal, 27 June–1 July 2016, pp. 61–76 (2016). https://doi.org/10.1109/CSF.2016.12
  10. Marshall, A.W., Olkin, I., Arnold, B.C.: Inequalities: Theory of Majorization and Its Applications. Mathematics in Science and Engineering, vol. 143. Academic Press, Cambridge (1979)
  11. Massey, J.L.: Guessing and entropy. In: Proceedings of the IEEE International Symposium on Information Theory, p. 204. IEEE (1994)
  12. McIver, A., Morgan, C., Smith, G., Espinoza, B., Meinicke, L.: Abstract channels and their robust information-leakage ordering. In: Abadi, M., Kremer, S. (eds.) POST 2014. LNCS, vol. 8414, pp. 83–102. Springer, Heidelberg (2014). https://doi.org/10.1007/978-3-642-54792-8_5
  13. Rényi, A.: On measures of entropy and information. In: Proceedings of the 4th Berkeley Symposium on Mathematics, Statistics, and Probability, pp. 547–561 (1961)
  14. Shannon, C.E.: A mathematical theory of communication. Bell Syst. Tech. J. 27, 379–423, 623–656 (1948)
  15. Sharma, B.D., Mittal, D.P.: New non-additive measures of entropy for discrete probability distributions. J. Math. Sci. (Soc. Math. Sci. Delhi India) 10, 28–40 (1975)
  16. Smith, G.: On the foundations of quantitative information flow. In: de Alfaro, L. (ed.) FoSSaCS 2009. LNCS, vol. 5504, pp. 288–302. Springer, Heidelberg (2009). https://doi.org/10.1007/978-3-642-00596-1_21
  17. Tsallis, C.: Possible generalization of Boltzmann-Gibbs statistics. J. Stat. Phys. 52(1–2), 479–487 (1988)

Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  • Arthur Américo
  • M. H. R. Khouzani
  • Pasquale Malacaria

  School of Electronic Engineering and Computer Science, Queen Mary University of London, London, UK