Combatting Insider Threats

Chapter
Part of the Advances in Information Security book series (ADIS, volume 49)

Abstract

Risks from insider threats are strongly context dependent, and arise in many ways, at different layers of system abstraction, for different types of systems. We discuss basic characteristics of insider threats, and consider approaches to developing and using computer-related environments that require systems and networking to be trustworthy in spite of insider misuse. We also consider future research that could improve detection, prevention, and response. This chapter seeks to cope with insider misuse in a broad range of application domains, for example, critical infrastructures, privacy-preserving database systems, financial systems, and interoperable health-care infrastructures. To illustrate this, we apply the principles considered here to the task of detecting and preventing insider misuse in systems that might be used to facilitate trustworthy elections. This discussion includes an examination of the relevance of the Saltzer-Schroeder-Kaashoek security principles and the Clark-Wilson integrity properties for end-to-end election integrity. Trustworthy system developments must consider insider misuse as merely one set of threats that must be addressed consistently together with many other threats, such as penetrations, denials of service, system faults and failures, and other threats to survivability. In addition, insider misuse cannot be realistically addressed unless significant improvements are made in the trustworthiness of component systems and their networking, as well as in their predictably trustworthy composition into enterprise solutions: architecturally, developmentally, and operationally.

References

  1. K.J. Biba. Integrity considerations for secure computer systems. Technical Report MTR-3153, The MITRE Corporation, Bedford, Massachusetts, June 1975. Also available from USAF Electronic Systems Division, Bedford, Massachusetts, as ESD-TR-76-372, April 1977.
  2. M. Bishop. Position: "Insider" is relative. In Proceedings of the 2005 New Security Paradigms Workshop, pages 77–78, Lake Arrowhead, California, October 2005.
  3. M. Bishop, S. Engle, C. Gates, S. Peisert, and S. Whalen. We have met the enemy and he is us. In Proceedings of the 2008 New Security Paradigms Workshop, Olympic Valley, California, 2008.
  4. D.D. Clark and D.R. Wilson. A comparison of commercial and military computer security policies. In Proceedings of the 1987 Symposium on Security and Privacy, pages 184–194, Oakland, California, April 1987. IEEE Computer Society.
  5. F.J. Corbató. On building systems that will fail (1990 Turing Award Lecture, with a following interview by Karen Frenkel). Communications of the ACM, 34(9):72–90, September 1991.
  6. R.C. Daley and P.G. Neumann. A general-purpose file system for secondary storage. In AFIPS Conference Proceedings, Fall Joint Computer Conference, pages 213–229. Spartan Books, November 1965.
  7. V.D. Gligor et al. Design and implementation of Secure Xenix. In Proceedings of the 1986 Symposium on Security and Privacy, Oakland, California, April 1986. IEEE Computer Society. Also in IEEE Transactions on Software Engineering, SE-13(2):208–221, February 1987.
  8. P.A. Karger. Limiting the damage potential of discretionary Trojan horses. In Proceedings of the 1987 Symposium on Security and Privacy, pages 32–37, Oakland, California, April 1987. IEEE Computer Society.
  9. C.E. Landwehr, A.R. Bull, J.P. McDermott, and W.S. Choi. A taxonomy of computer program security flaws, with examples. Technical report, Center for Secure Information Technology, Information Technology Division, Naval Research Laboratory, Washington, D.C., November 1993.
  10. D. Maughan et al. A roadmap for cybersecurity research. Technical report, Department of Homeland Security, November 2009.
  11. P.G. Neumann. Computer-Related Risks. ACM Press, New York, and Addison-Wesley, Reading, Massachusetts, 1995.
  12. P.G. Neumann. Practical architectures for survivable systems and networks. Final Report, Phase Two, Project 1688, SRI International, Menlo Park, California, June 2000. http://www.csl.sri.com/neumann/survivability.html
  13. P.G. Neumann. Principled assuredly trustworthy composable architectures. Technical report, Computer Science Laboratory, SRI International, Menlo Park, California, December 2004. http://www.csl.sri.com/neumann/chats4.html (also in .pdf and .ps).
  14. P.G. Neumann. Reflections on system trustworthiness. In Marvin Zelkowitz, editor, Advances in Computers, volume 70, pages 269–310. Elsevier, 2007.
  15. P.G. Neumann. Security and privacy in the Employment Eligibility Verification System (EEVS) and related systems. In Congressional Record, Washington, D.C., June 7, 2007. U.S. House of Representatives.
  16. P.G. Neumann. Illustrative risks to the public in the use of computer systems and related technology, index to RISKS cases. Technical report, Computer Science Laboratory, SRI International, Menlo Park, California, 2009. Updated occasionally: http://www.csl.sri.com/neumann/illustrative.html (also in .ps and .pdf form for printing in a denser format).
  17. P.G. Neumann, R.S. Boyer, R.J. Feiertag, K.N. Levitt, and L. Robinson. A Provably Secure Operating System: The system, its applications, and proofs. Report CSL-116, second edition, Computer Science Laboratory, SRI International, Menlo Park, California, May 1980.
  18. P.G. Neumann and P.A. Porras. Experience with EMERALD to date. In Proceedings of the First USENIX Workshop on Intrusion Detection and Network Monitoring, pages 73–80, Santa Clara, California, April 1999. USENIX. Best paper.
  19. P.A. Porras and P.G. Neumann. EMERALD: Event Monitoring Enabling Responses to Anomalous Live Disturbances. In Proceedings of the Nineteenth National Computer Security Conference, pages 353–365, Baltimore, Maryland, October 22–25, 1997. NIST/NCSC.
  20. J.H. Saltzer. Protection and the control of information sharing in Multics. Communications of the ACM, 17(7):388–402, July 1974.
  21. J.H. Saltzer and F. Kaashoek. Principles of Computer System Design. Morgan Kaufmann, 2009. Chapters 1–6 only; Chapters 7–11 are online: http://ocw.mit.edu/Saltzer-Kaashoek
  22. J.H. Saltzer and M.D. Schroeder. The protection of information in computer systems. Proceedings of the IEEE, 63(9):1278–1308, September 1975.
  23. S. Stolfo, S. Bellovin, S. Hershkop, S. Sinclair, and S. Smith. Insider Attack and Cyber Security: Beyond the Hacker. Springer, 2008.
  24. K.-P. Yee. Building Reliable Voting Machine Software. PhD thesis, University of California, Berkeley, 2007. Technical Report 2007-167; see also Technical Note 2007-136 for the security review; http://pvote.org
  25. L.S. Zegans. The psychology of risks. Communications of the ACM, 51(1):152, January 2008. Inside Risks column.

Copyright information

© Springer Science+Business Media, LLC 2010

Authors and Affiliations

  1. Principled Systems Group, Computer Science Lab, SRI International, Menlo Park, USA