
Part of the book series: Human-Computer Interaction Series (HCIS, volume 2)


Abstract

In this chapter, the solutions developed so far are juxtaposed with the requirements listed in Chapter 5. Where no appropriate solution for meeting a requirement was found, the risks involved in employing a user-adaptive system are described.


References

  1. See Chapter 6.2.2.1.

  2. See http://www.csee.umbc.edu/kqml/software/.

  3. See Chapter 5.2.1.

  4. Rivest, Shamir, and Adleman.

  5. See Request for Comments (RFC) 2459: Internet X.509 Public Key Infrastructure Certificate and CRL Profile.

  6. See Request for Comments (RFC) 2459: Internet X.509 Public Key Infrastructure Certificate and CRL Profile.

  7. See [Schneier, 1996, Chap. 8] or [Menezes et al., 1997, Chap. 13] for key management techniques.

  8. See [Hirsch, 1997] for a detailed discussion of the SSL handshake phase.

  9. See http://www2.psy.uq.edu.au/~ftp/Crypto/ and OpenSSL (http://www.openssl.org/).

  10. See [Hirsch, 1997] for a detailed discussion of the SSL handshake phase (and encryption process).

  11. See Request for Comments (RFC) 1422: Privacy Enhancement for Internet Electronic Mail: Part II: Certificate-Based Key Management.

  12. The additional parameters explained in the previous section (for instance, the certificate file) may be predefined within SKAPI for an invariable communication partner and need not be provided by the user modeling component using SKAPI; in this case, the additional parameters may be omitted.

  13. A model of a set of axioms exists only if the axioms are free of contradiction. An existing model represents only one of potentially many assignments of the syntactic elements of the axiom set by an interpretation function; the model is thereby one concretization (see [Chang and Keisler, 1990]). A worked example follows this reference list.

  14. For instance, in a user model, anonymous data about sensitive characteristics can constitute one category and identifying information about the user a second category.

  15. For Goguen and Meseguer (see [Goguen and Meseguer, 1982] and [Goguen and Meseguer, 1984]), the term user corresponds to a user model client which accesses information from a user model, not to the user being modeled.

  16. Reflexive, transitive, antisymmetric.

  17. Because all components involved in the security model have to comply with such conditions, these models are often called mandatory security models (MAC models).

  18. See [Shannon, 1949], [Blahut, 1987], or [Denning, 1982] for entropy and conditional entropy; the standard definitions are recalled after this reference list.

  19. For this example, the user model consists of a set of first-order logic formulas; see [Pohl, 1998, Chap. 3].

  20. See [Denning, 1982, Chap. 5] or [Birkhoff, 1962] for a definition; a sketch of the partial order and lattice definitions follows this reference list.

  21. Reflexive, transitive, antisymmetric.

  22. The relation of access modes can be modified at the users' discretion.

  23. For instance, two permissions read-identifying and read-anonymous can distinguish read access to user model entries which make it possible either to identify the user or to maintain his anonymity (see the sketch after this reference list).

  24. For instance, the described security models focus either on the requirement for confidentiality or on the requirement for integrity (see Chapter 7.2.2.1). For user models, orienting the model exclusively on either confidentiality or integrity yields negative results. In the first case, user model clients which are classified to handle confidential information are not able to correct user model information which is accessible to clients classified to handle less confidential information; the integrity of user model information on lower confidentiality levels can therefore not be maintained by clients on higher confidentiality levels. In the latter case, clients which are considered to foster integrity better must be able to supersede a greater set of user model information than clients which are less reliable; clients on a high integrity level will therefore keep their information less confidential. For user modeling purposes, a mixture of these two orientations is suitable, which calls for a security model that can adapt to varying policies.

  25. The conflict between confidentiality and integrity inherent in such security models is discussed in [Wiseman, 1991] and on p. 120. Possible ways of resolving this conflict are explored in Chapter 7.2.2.4 and Chapter 7.1.2.4.

  26. Common requirements of user-adaptive systems used for the examples in this section are enclosed in quotation marks.

  27. See also the security classes of example 7.10 on p. 114.

  28. Permissions for this example are defined on pp. 128 and 136.

  29. More precisely: permission names are related to role names.

  30. Permission names are motivated by KQML performatives; see Chapter 6.2.2.

  31. Sessions are defined in the following section.

  32. The arrows point in the opposite direction.

  33. The arrows point in the opposite direction.

  34. See the next section for the definition of reference.

  35. RBAC/Web Release 1.1, http://csrc.nist.gov/rbac/

  36. National Institute of Standards and Technology, Maryland, USA

  37. The identification and authentication of the role administrator are handled via the web server.

  38. Users of the RBAC model correspond to application systems for the scope of this work, but might also include the user of the user-adaptive system.

  39. The domain contains not only independent role hierarchies but also independent RBAC models (see RBAC_0, RBAC_1, and RBAC_2 in Chapter 7.1.2.4).

  40. Compare the URLs of Figures 7.8 and 7.11: domain interest at http://terra.gmd.de:8080/INTEREST/login; domain IntAppl at http://terra.gmd.de:8080/OFFICEAPPLICATION/login

  41. See the definition of the relation RH on p. 120.

  42. See the definition of the relation VA on p. 119.

  43. See the definition of the relation PA on p. 119.

  44. See p. 119.

  45. See Equation 7.18 on p. 128.

  46. See Chapter 6.2.2.

  47. See Chapter 8.2 for further examples.

  48. In this implementation, a session s_i (see RBAC_0 on p. 119) lasts for one access request, and user(s_i) yields the name of the (possibly authenticated) sender of the SKQML message. The maximum set of possible roles is applied for user(s_i). A sketch of this per-request session handling follows this reference list.

  49. See also the example described in Chapter 8.2 on p. 164.

  50. See Equations 7.15 on p. 125, 7.16 on p. 126, and 7.17 on p. 127.

  51. See [Ferraiolo et al., 1999], [Gavrila and Barkley, 1998], [Goh and Baldwin, 1998], [Lawrence, 1993], [Moffett, 1998], [Nyanchama and Osborn, 1994], [Sandhu et al., 1996], and [Sandhu and Park, 1998].

  52. In contrast to this simple example, Zurfluh argues that persons might assume up to 100 roles in interaction with their environment [Zurfluh, 1998, p. 50].

  53. A concept can be defined as the triple (designator, intension, extension).

  54. Theory and Applications for General User/Learner-modeling Systems.

  55. See p. 110.

  56. See p. 114.

  57. See Chapter 7.1.2.3 and Chapter 7.1.4.

  58. See Chapter 7.2.1.5.

  59. Organized Techniques for Theorem-proving and Effective Research.

  60. See [Genesereth and Nilsson, 1987].

  61. See Chapter 5.1.2.

  62. See Chapter 7.1.2.2.

  63. See Chapter 6.1.2.

  64. See Chapter 4.4.

  65. See Chapter 4.4 and Chapter 7.2.2.4.

  66. See Chapter 6.1.1.

  67. See Chapter 7.2.2.2.

  68. See Chapter 7.2.2.1.

  69. See Chapter 6.2.2 and p. 79.
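Worked examples for selected notes

To make note 13 concrete, here is a minimal worked example; the axiom set, domain, and interpretations are chosen purely for illustration and do not come from the chapter.

\[
A = \{\, P(a),\ \forall x\,(P(x) \rightarrow Q(x)) \,\}, \qquad
\mathcal{M} = (D, I),\quad D = \{1, 2\},\ I(a) = 1,\ I(P) = \{1\},\ I(Q) = \{1, 2\}.
\]

The interpretation \(\mathcal{M}\) satisfies both axioms, but it is only one of many models: \(D' = \{1\}\) with \(I'(P) = I'(Q) = \{1\}\) is another model of the same axioms, so each model is merely one possible concretization. An inconsistent set such as \(\{P(a), \neg P(a)\}\) has no model at all.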
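Notes 16, 20, and 21 refer to the standard order-theoretic definitions; they are recalled here in the usual notation (textbook material, not the chapter's own formulation). A relation \(\leq\) on a set \(SC\) of security classes is a partial order if, for all \(a, b, c \in SC\):

\[
\begin{aligned}
&a \leq a && \text{(reflexive)}\\
&a \leq b \ \wedge\ b \leq c \ \Rightarrow\ a \leq c && \text{(transitive)}\\
&a \leq b \ \wedge\ b \leq a \ \Rightarrow\ a = b && \text{(antisymmetric)}
\end{aligned}
\]

\((SC, \leq)\) is a lattice if, in addition, every pair of classes has a least upper bound and a greatest lower bound (cf. [Denning, 1982] and [Birkhoff, 1962]).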
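For note 18, the entropy and conditional entropy referred to there are the standard Shannon quantities:

\[
H(X) = -\sum_{x} p(x)\,\log_2 p(x), \qquad
H(X \mid Y) = -\sum_{y} p(y) \sum_{x} p(x \mid y)\,\log_2 p(x \mid y).
\]

\(H(X \mid Y)\) measures the uncertainty about \(X\) that remains once \(Y\) is known; \(H(X \mid Y) \leq H(X)\), with equality exactly when \(X\) and \(Y\) are independent.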
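The following sketch illustrates notes 23 and 48 together: two read permissions that distinguish identifying from anonymous user model entries, and a session that lasts for exactly one access request and activates the maximum role set of the (possibly authenticated) sender. All names (UA, PA, open_session, the role and client names) are hypothetical and only approximate the behaviour described in the notes; this is not the chapter's SKAPI or RBAC/Web code.

    from dataclasses import dataclass, field

    # Note 23: two permissions distinguish read access to identifying vs. anonymous entries.
    READ_ANONYMOUS = "read-anonymous"
    READ_IDENTIFYING = "read-identifying"

    # Illustrative permission assignment (roles -> permissions) and user assignment
    # (user model clients -> roles); both relations are invented for this sketch.
    PA = {
        "assistant-role": {READ_ANONYMOUS},
        "broker-role": {READ_ANONYMOUS, READ_IDENTIFYING},
    }
    UA = {
        "interest-client": {"assistant-role"},
        "office-client": {"assistant-role", "broker-role"},
    }

    @dataclass
    class Session:
        """Note 48: a session lasts for exactly one access request."""
        user: str                        # (possibly authenticated) sender of the message
        roles: set = field(default_factory=set)

    def open_session(sender: str) -> Session:
        # The maximum set of roles assigned to the sender is activated for this request.
        return Session(user=sender, roles=set(UA.get(sender, set())))

    def permitted(session: Session, permission: str) -> bool:
        # A request is allowed if any active role of the one-request session carries the permission.
        return any(permission in PA.get(role, set()) for role in session.roles)

    if __name__ == "__main__":
        s = open_session("interest-client")      # fresh session for a single access request
        print(permitted(s, READ_ANONYMOUS))      # True: anonymous entries are readable
        print(permitted(s, READ_IDENTIFYING))    # False: identifying entries are withheld

Opening a fresh session per request, as note 48 describes, keeps role activation stateless on the server side; the trade-off is that the sender's full role set is always in effect for that single request.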


Copyright information

© 2003 Springer Science+Business Media Dordrecht

About this chapter

Cite this chapter

Schreck, J. (2003). Solutions for Security. In: Security and Privacy in User Modeling. Human-Computer Interaction Series, vol 2. Springer, Dordrecht. https://doi.org/10.1007/978-94-017-0377-2_8

  • DOI: https://doi.org/10.1007/978-94-017-0377-2_8

  • Publisher Name: Springer, Dordrecht

  • Print ISBN: 978-90-481-6223-9

  • Online ISBN: 978-94-017-0377-2
