Predictably Deterrable? The Case of System Trespassers

  • David Maimon
  • Alexander Testa
  • Bertrand Sobesto
  • Michel Cukier
  • Wuling Ren
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11637)


Can computing environments deter system trespassers and increase the likelihood that intruders cover their tracks during the progression of a system trespassing event? To generate sufficient empirical evidence to answer this question, we designed a series of randomized field trials using a large set of target computers built for the sole purpose of being infiltrated. We configured these computers to present varying levels of ambiguity regarding the presence of surveillance on the system, and investigated how this ambiguity influenced the likelihood that system trespassers would issue clean tracks commands. Findings indicate that the presence of unambiguous signs of surveillance increases the probability of clean tracks commands being entered on the system. Nevertheless, even when given clear signs of detection, intruders are less likely to use clean tracks commands in the absence of subsequent presentations of sanction threats. These results indicate that implementing deterring policies and tools in cyberspace could nudge system trespassers to exhibit more caution during system trespassing events. Our findings also emphasize the relevance of social-science models in guiding cyber security experts' continuing efforts to predict and respond to system trespassers' illegitimate online activities.
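The core of the design described above is random assignment of target computers to experimental conditions that vary the ambiguity of surveillance cues. The sketch below illustrates that assignment step only; the condition labels, target names, and counts are illustrative assumptions, not the authors' actual configuration.

```python
import random
from collections import Counter

# Hypothetical condition labels: the study varies how ambiguous the signs of
# surveillance are, but the exact conditions used here are assumptions.
CONDITIONS = ["no_surveillance_sign", "ambiguous_sign", "unambiguous_sign"]

def assign_conditions(target_ids, seed=42):
    """Randomly assign each target computer to one experimental condition.

    A fixed seed makes the assignment reproducible for auditing purposes.
    """
    rng = random.Random(seed)
    return {tid: rng.choice(CONDITIONS) for tid in target_ids}

# Illustrative pool of target computers (names and count are made up).
targets = [f"honeypot-{i:03d}" for i in range(90)]
assignment = assign_conditions(targets)

# Tally how many targets landed in each condition to check the balance.
print(Counter(assignment.values()))
```

In an actual deployment, the assigned condition would determine which surveillance banner (if any) each target displays, and the outcome of interest would be whether intruders on that target issue clean tracks commands.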


Keywords: System trespassing · Deterrence · Randomized trial · Ambiguity



This research was conducted with the support of the National Science Foundation Award 1223634.



Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  • David Maimon, Department of Criminology and Criminal Justice, Georgia State University, Atlanta, USA
  • Alexander Testa, Department of Criminal Justice, University of Texas at San Antonio, San Antonio, USA
  • Bertrand Sobesto, Division of Information Technology, University of Maryland, College Park, USA
  • Michel Cukier, A. James Clark School of Engineering, University of Maryland, College Park, USA
  • Wuling Ren, College of Computer and Information Engineering, Zhejiang Gongshang University, Hangzhou, China
