A Demonic Lattice of Information


Part of the book series: Lecture Notes in Computer Science (LNCS, volume 10160)

Abstract

Landauer and Redmond’s Lattice of Information was an early and influential formalisation of the pure structure of security [8]: a partial order was defined for information-flow from a hidden state. In modern terms we would say that more-security refines less-security. In Landauer’s deterministic case [op. cit.], the refinement order is a lattice.

Very recently [3, 9] a similar approach has been taken to purely probabilistic systems and there too a refinement order can be defined; but it is not a lattice [12].

In between deterministic and probabilistic is demonic, where behaviour is not deterministic but also not quantifiable. We show that our own earlier approach to this [15, 16] fits into the same pattern as deterministic and probabilistic, and illustrate that with results concerning compositionality, testing, soundness and completeness. Finally, we make some remarks about source-level reasoning.


Notes

  1.

    We use the feminine she/her consistently for adversaries. Plural we/us is used for the designers or users of programs, or the readers of this article; and neuter “it” or plural “they” is used for third parties.

  2.

    Although both \(C^1\) and \(C^2\) distinguish A, W, their join cannot. Because \(C^2\) does not distinguish A, B, the join cannot; nor can it distinguish B, W, because \(C^1\) does not: by transitivity the join must therefore regard A, W as equal. The same applies to E, W.
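
    The transitive-closure argument in this footnote can be sketched concretely. Below, C1 and C2 are hypothetical stand-ins (illustrative partitions of a four-element state space, not the paper's actual \(C^1\), \(C^2\)), and the join is computed by merging any states linked through a cell of either partition:

```python
# Hypothetical example channels over states {A, B, E, W}, given as
# partitions; these are illustrative stand-ins, not the paper's C^1, C^2.
C1 = [{"A"}, {"B", "W"}, {"E"}]   # C1 cannot distinguish B, W
C2 = [{"A", "B"}, {"E"}, {"W"}]   # C2 cannot distinguish A, B

def join(p1, p2):
    """The partition merging any states linked by a chain of cells drawn
    from either argument (transitive closure of 'shares a cell')."""
    parent = {}

    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving
            x = parent[x]
        return x

    for cell in p1 + p2:
        members = sorted(cell)
        for x in members:
            find(x)                          # register every state
        for a, b in zip(members, members[1:]):
            parent[find(a)] = find(b)        # merge along the cell
    cells = {}
    for x in parent:
        cells.setdefault(find(x), set()).add(x)
    return sorted(sorted(c) for c in cells.values())

print(join(C1, C2))   # → [['A', 'B', 'W'], ['E']]
```

    Here A, B and W collapse into a single cell exactly by the chain of indistinguishabilities the footnote describes.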

  3.

    For a matrix M we write \(M_{r,c}\) for the value in row r and column c.

  4.

    We write SR for the matrix multiplication of S and R.

  5.

    The mnemonics are S for Specification and P for an imPlementation (or Program) that is supposed to refine S, and K for a “Kludge” that, as an example, in fact does not refine S. In uncommitted cases, neither P nor K, we will use X.

  6.

    In the probabilistic case (Sect. 2), there would be infinitely many R’s to check: it would literally take forever.

  7.

    The identity matrix is stochastic, and the product of two stochastic matrices is stochastic. Matrix columns are similar just when each is a multiple of the other. Column order can then be ignored by representing the matrix as a set of (the remaining) columns.
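
    These closure facts are easy to check mechanically. The sketch below (Python with NumPy, assuming row-stochastic orientation; the column-stochastic case is symmetric) verifies them, together with the "similar columns" condition:

```python
import numpy as np

rng = np.random.default_rng(0)

def random_stochastic(n, m):
    """A random row-stochastic matrix: non-negative entries, rows summing
    to 1. (Row- vs column-stochastic orientation is an assumption here.)"""
    M = rng.random((n, m))
    return M / M.sum(axis=1, keepdims=True)

S = random_stochastic(3, 4)   # illustrative names only
R = random_stochastic(4, 2)

# The identity matrix is stochastic...
assert np.allclose(np.eye(3).sum(axis=1), 1.0)
# ...and the product SR of two stochastic matrices is again stochastic.
assert np.allclose((S @ R).sum(axis=1), 1.0)

def similar(c1, c2):
    """Two columns are 'similar' just when each is a multiple of the
    other, i.e. they are parallel as vectors (rank at most 1 together)."""
    stacked = np.column_stack([c1, c2]).astype(float)
    return np.linalg.matrix_rank(stacked) <= 1

assert similar(np.array([1.0, 2.0]), np.array([0.5, 1.0]))
assert not similar(np.array([1.0, 2.0]), np.array([2.0, 1.0]))
```

    Representing a matrix as the set of its remaining columns, as the footnote suggests, then discards column order for free.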

  8.

    The completeness property was called the Coriaceous Conjecture in [3]. It was proved in [1, 9] and, it turns out, earlier by [4].

  9.

    We continue to call them “cells”, as for partitions, in spite of the possible overlaps.

  10.

    We write matrices (of any element-type) with \(\mathcal{X}\)-indexed rows and \(\mathcal{Y}\)-indexed columns; the deterministic such matrices are isomorphic to functions \(\mathcal{I}\mathbin {\rightarrow }\mathcal{O}\).

  11.

    For any subset I of \(\mathcal{I}\) we have \(\emptyset \,{\subseteq }\,I\) and so \(\emptyset = \mathbin {\cup }\emptyset \in I^{\mathbin {\cup }}\) also.
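
    The point that the empty union \(\mathbin {\cup }\emptyset =\emptyset \) belongs to every union-closure shows up directly when the closure is computed; union_closure below is a hypothetical helper for illustration, not notation from the paper:

```python
def union_closure(family):
    """All unions of subfamilies of `family` -- including the empty
    subfamily, whose union is the empty set (the footnote's point)."""
    closed = {frozenset()}                  # the empty union, ∅
    for s in map(frozenset, family):
        closed |= {c | s for c in closed}
    return closed

fam = [{1}, {2, 3}]
print(sorted(map(sorted, union_closure(fam))))
# → [[], [1], [1, 2, 3], [2, 3]]
```

    Even for the empty family, the closure is the non-empty set {∅}.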

  12.

    They are trivially sound, however, since weakening a test suite trivially preserves its soundness: with fewer tests, there will be fewer failures.

  13.

    In fact \(i\in \iota \) is not necessary, since a pair \((i,\iota )\) with \(i\notin \iota \) would be a test passed by every cell, vacuously sound for all channels. Allowing it would make no difference.

  14.

    Subsets of \(\mathcal{I}\), rather than individual elements, are more easily turned into predicates for source-level reasoning over a state space of typed variables: if you add another variable, a subset remains a subset but a point is no longer a point.

  15.

    Non-trivial tests make at least one distinction. A test \((\alpha ,\beta )\) is trivial when \(\alpha \subseteq \beta \) (it is passed by every cell) and when \(\alpha ,\beta \) are disjoint (it is passed only by the cell \(\emptyset \)). In general \((\alpha ,\beta )\) is equivalent to \((\alpha ,\alpha \,{\mathbin {\cap }}\,\beta )\).

    Also, for example, \((\alpha ',\beta ')\) is weaker than \((\alpha ,\beta )\) when \(\alpha '\subseteq \alpha \) and \(\beta \subseteq \beta '\). Compare Footnote 21 below.
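
    The relations stated in this footnote (triviality, equivalence, weakening) can be transcribed directly. The sketch below encodes only what the footnote states, with sets as Python set values; the pass/fail semantics of tests against cells is defined in the paper's body and is not modelled here:

```python
def weaker(t1, t2):
    """(a1, b1) is weaker than (a2, b2) when a1 ⊆ a2 and b2 ⊆ b1."""
    (a1, b1), (a2, b2) = t1, t2
    return a1 <= a2 and b2 <= b1

def trivial(t):
    """A test (a, b) is trivial when a ⊆ b (every cell passes)
    or when a and b are disjoint (only the empty cell passes)."""
    a, b = t
    return a <= b or not (a & b)

def normalise(t):
    """(a, b) is equivalent to (a, a ∩ b)."""
    a, b = t
    return (a, a & b)

assert trivial(({1}, {1, 2}))          # a ⊆ b: passed by every cell
assert trivial(({1}, {2}))             # disjoint: passed only by ∅
assert not trivial(({1, 2}, {2, 3}))   # makes a genuine distinction
assert normalise(({1, 2}, {2, 3})) == ({1, 2}, {2})
assert weaker(({1}, {2, 3}), ({1, 2}, {2}))   # a'⊆ a and b ⊆ b'
```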

  16.

    Just to be clear: a security breach releasing some large number N of passwords usually means, in our terms, that there are N singleton cells, not that there is just one cell with N passwords in it. The former means that each of N people has his password leaked exactly. The latter means instead that someone’s password is learned to be one of those N.

  17.

    Together they are an equivalence because \(S\,{\mathrel \sqsubseteq }\,P\) iff \((S{\mathbin {\parallel }}C)\mathrel {\preccurlyeq }(P{\mathbin {\parallel }}C)\) for all C.

  18.

    It’s “can” rather than “must” because K is nondeterministic: it might not select cell \(\kappa \) for input k; but because \(k\in \kappa \), it can.

  19.

    This is obviously by analogy with weakest preconditions [6].

  20.

    Starting again, from conjunctions of post-tests, will just generate conjunctions of conjunctions of pre-tests, so we do not have to expand our expressiveness any further. Furthermore, every member of \({\mathbb U}\mathcal{I}\) is characterised uniquely by a conjunction of such tests: every conjunction of tests “is” union-closed (easy); and for every union-closed set there is a conjunction of tests that only it satisfies (sneaky).

  21.

    In (9) here the \((\alpha ',\beta ')\) on the right is weaker than \((\alpha ,\beta )\) on the left because we have \(\alpha '\subseteq \alpha \) and \(\alpha '\mathbin {\cap }\beta \subseteq \beta '\). Compare Footnote 15 above.

  22.

    We assume here that S is everywhere terminating.

References

  1. Alvim, M.S., Chatzikokolakis, K., McIver, A., Morgan, C., Palamidessi, C., Smith, G.: Additive and multiplicative notions of leakage, and their capacities. In: IEEE 27th CSF 2014, pp. 308–322. IEEE (2014)

  2. Alvim, M.S., Chatzikokolakis, K., McIver, A., Morgan, C., Palamidessi, C., Smith, G.: Axioms for information leakage. Proc. CSF 2016, 77–92 (2016)

  3. Alvim, M.S., Chatzikokolakis, K., Palamidessi, C., Smith, G.: Measuring information leakage using generalized gain functions. In: Proceedings of 25th IEEE (CSF 2012), pp. 265–279, June 2012

  4. Blackwell, D.: Comparison of experiments. In: Proceedings Second Berkeley Symposium on Mathematical Statistics and Probability, pp. 93–102 (1951)

  5. Bordenabe, N.E., Smith, G.: Correlated secrets in information flow. Proc. CSF 2016, 93–104 (2016)

  6. Dijkstra, E.W.: A Discipline of Programming. Prentice-Hall, New Jersey (1976)

  7. Goguen, J.A., Meseguer, J.: Unwinding and inference control. In: Proceedings of IEEE Symposium on Security and Privacy, pp. 75–86. IEEE Computer Society (1984)

  8. Landauer, J., Redmond, T.: A lattice of information. In: Proceedings of 6th IEEE CSFW 1993, pp. 65–70, June 1993

  9. McIver, A., Meinicke, L., Morgan, C.: Compositional closure for Bayes risk in probabilistic noninterference. In: Abramsky, S., Gavoille, C., Kirchner, C., Meyer auf der Heide, F., Spirakis, P.G. (eds.) ICALP 2010. LNCS, vol. 6199, pp. 223–235. Springer, Heidelberg (2010). doi:10.1007/978-3-642-14162-1_19

  10. McIver, A., Meinicke, L., Morgan, C.: A Kantorovich-Monadic powerdomain for information hiding, with probability and nondeterminism. In: Proceedings LICS 2012 (2012)

  11. McIver, A., Meinicke, L., Morgan, C.: Hidden-Markov program algebra with iteration. Mathematical Structures in Computer Science (2014)

  12. McIver, A., Morgan, C., Meinicke, L., Smith, G., Espinoza, B.: Abstract channels, gain functions and the information order. In: FCS 2013 (2013). http://prosecco.gforge.inria.fr/personal/bblanche/fcs13/fcs13proceedings.pdf

  13. McIver, A., Morgan, C., Rabehaja, T.: Abstract hidden Markov models: a monadic account of quantitative information flow. In: Proceedings LICS 2015 (2015)

  14. McIver, A., Morgan, C., Smith, G., Espinoza, B., Meinicke, L.: Abstract channels and their robust information-leakage ordering. In: Abadi, M., Kremer, S. (eds.) POST 2014. LNCS, vol. 8414, pp. 83–102. Springer, Heidelberg (2014). doi:10.1007/978-3-642-54792-8_5

  15. Morgan, C.: The shadow knows: refinement of ignorance in sequential programs. In: Uustalu, T. (ed.) MPC 2006. LNCS, vol. 4014, pp. 359–378. Springer, Heidelberg (2006). doi:10.1007/11783596_21

  16. Morgan, C.: The shadow knows: refinement of ignorance in sequential programs. Sci. Comput. Program. 74(8), 629–653 (2009). Treats Oblivious Transfer

  17. Shannon, C.E.: A mathematical theory of communication. Bell Syst. Tech. J. 27, 379–423, 623–656 (1948)

  18. Smith, G.: On the foundations of quantitative information flow. In: Alfaro, L. (ed.) FoSSaCS 2009. LNCS, vol. 5504, pp. 288–302. Springer, Heidelberg (2009). doi:10.1007/978-3-642-00596-1_21

  19. Wirth, N.: Program development by stepwise refinement. Commun. ACM 14(4), 221–227 (1971)

Corresponding author

Correspondence to Carroll Morgan.


Copyright information

© 2017 Springer International Publishing AG

About this chapter

Cite this chapter

Morgan, C. (2017). A Demonic Lattice of Information. In: Gibson-Robinson, T., Hopcroft, P., Lazić, R. (eds) Concurrency, Security, and Puzzles. Lecture Notes in Computer Science, vol. 10160. Springer, Cham. https://doi.org/10.1007/978-3-319-51046-0_11

  • DOI: https://doi.org/10.1007/978-3-319-51046-0_11

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-51045-3

  • Online ISBN: 978-3-319-51046-0
