Reconfigurable Tamper-resistant Hardware Support Against Insider Threats: The Trusted ILLIAC Approach

  • Ravishankar K. Iyer
  • Paul Dabrowski
  • Nithin Nakka
  • Zbigniew Kalbarczyk
Part of the Advances in Information Security book series (ADIS, volume 39)


“An insider attack, sometimes referred to as an inside job, is defined as a crime perpetrated by, or with the help of, a person working for or trusted by the victim” [1]. This one-sided relationship of trust makes insider attacks particularly insidious and difficult to protect against. This article motivates the need for secure, tamper-resistant storage of secret information that is impenetrable even to the operating system, and presents efficient ways of meeting this need. It highlights innovative work being developed in the context of the Trusted ILLIAC project at the University of Illinois. A progression of techniques is presented, providing increasing levels of security: from a purely software-based approach, through hardware/software partitioned mechanisms, to hardware-only mechanisms. These techniques guard the system against insiders with increasing levels of intrusive access, from user-level and administrative access up to physical access to the system under threat of attack. The techniques covered include software- and hardware-based memory randomization, along with hardware support for a threshold-cryptography-enabled mechanism that allows tamper-proof key management and supports the software technique. Further, we describe a technique based on Information Flow Signatures that provides runtime data-integrity guarantees; reconfigurable hardware is used to ensure the secure computation of critical data. To enable this trusted computing hardware, we explore the requirements for securely initializing it under the threat of an insider attack. The unique advantage of a hardware-implemented mechanism is that the secret, whether the key or the code that operates on security-critical data, cannot be revealed or modified even by the operating system.
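The Information Flow Signatures technique mentioned above derives, ahead of time, the set of program locations that may legitimately write a critical variable, and checks every runtime write against that set. The following is a minimal software sketch of that idea only; all names (`CriticalStore`, the writer IDs) are hypothetical, and the actual Trusted ILLIAC mechanism enforces the check in reconfigurable hardware rather than in software.

```python
# Hypothetical sketch of the Information Flow Signatures idea (not the
# Trusted ILLIAC implementation): a "signature" is the statically derived
# set of writers authorized to update a critical variable, and every
# runtime write is validated against it.

class SignatureViolation(Exception):
    """Raised when an unauthorized writer touches critical data."""

class CriticalStore:
    def __init__(self):
        # signature: critical variable name -> frozenset of authorized writer IDs
        self._signatures = {}
        self._values = {}

    def register(self, name, authorized_writers):
        """Record the statically derived signature for a critical variable."""
        self._signatures[name] = frozenset(authorized_writers)

    def write(self, name, value, writer_id):
        """Runtime check: only writers in the signature may update the data."""
        if writer_id not in self._signatures.get(name, frozenset()):
            raise SignatureViolation(f"{writer_id!r} may not write {name!r}")
        self._values[name] = value

    def read(self, name):
        return self._values[name]

store = CriticalStore()
# Suppose analysis found that only update_balance() legitimately writes 'balance'.
store.register("balance", {"update_balance"})

store.write("balance", 100, writer_id="update_balance")  # authorized: accepted
try:
    # A write reached via a corrupted pointer is not in the signature: rejected.
    store.write("balance", 0, writer_id="overflowed_strcpy")
except SignatureViolation:
    pass
```

In the hardware-enforced variant, the checking logic and the signature itself reside outside the reach of software, so even an insider with administrative privileges cannot disable or alter the check.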


Keywords: Critical Data · Trusted Platform Module · Threat Model · Insider Attack · Security Technique




  [1] N. Einwechter. “The Enemy Inside the Gates: Preventing and Detecting Insider Attacks.”
  [2] L. A. Gordon, M. P. Loeb, W. Lucyshyn, and R. Richardson. “2005 CSI/FBI Computer Crime and Security Survey.”
  [3] Trusted ILLIAC: a large, demonstrably trusted cluster computing platform.
  [4] J. Xu, Z. Kalbarczyk, and R. K. Iyer. “Transparent runtime randomization for security.” In 22nd International Symposium on Reliable Distributed Systems (SRDS’03), pages 260–269, Florence, Italy, Oct. 2003. IEEE Press.
  [5] G. P. Saggese, C. Basile, L. Romano, Z. Kalbarczyk, and R. K. Iyer. “Hardware Support for High Performance, Intrusion- and Fault-Tolerant Systems.” In Proceedings of the 23rd IEEE International Symposium on Reliable Distributed Systems (SRDS’04), October 18–20, 2004, pages 195–204. IEEE Computer Society, Washington, DC.
  [6] H. Shacham, M. Page, B. Pfaff, E.-J. Goh, N. Modadugu, and D. Boneh. “On the effectiveness of address-space randomization.” In ACM Conference on Computer and Communications Security (CCS 2004), pages 298–307.
  [7] S. Farrell and R. Housley. “An internet attribute certificate profile for authorization.” IETF RFC 3281, 2002.
  [8] V. Shoup. “Practical threshold signatures.” LNCS, vol. 1807, pages 207–218, 2000.
  [9] P. L. Montgomery. “Modular multiplication without trial division.” Mathematics of Computation, vol. 44, no. 170, pages 519–521, 1985.
  [10] A. Mazzeo et al. “An FPGA-based implementation of the RSA algorithm.” In Proceedings of DATE, 2003.
  [11] T. Wollinger, J. Guajardo, and C. Paar. “Cryptography on FPGAs: State of the art implementations and attacks.” ACM Transactions on Embedded Computing Systems, 2003.
  [12] Xilinx. “Configuration issues: Power-up, volatility, security, battery back-up.” Application Note XAPP 092, 1997.
  [13] M. Joye. “Recovering lost efficiency of exponentiation algorithms on smart cards.” Electronics Letters, vol. 38, no. 19, pages 1095–1097, 2002.
  [14] R. Anderson, M. Bond, J. Clulow, and S. Skorobogatov. “Cryptographic Processors – A Survey.” University of Cambridge Technical Report, ISSN 1476-2986, 2005.
  [15] J. G. Dyer, M. Lindemann, R. Perez, R. Sailer, L. van Doorn, S. W. Smith, and S. Weingart. “Building the IBM 4758 Secure Coprocessor.” IEEE Computer, October 2001.
  [16] “TPM Main Part 1 Design Principles, Specification Version 1.2.” Level 2, Revision 103, 9 July 2007.
  [17] S. W. Smith. “Outbound Authentication for Programmable Secure Coprocessors.” In Proceedings of the 7th European Symposium on Research in Computer Security, 2001.
  [18] R. K. Iyer, Z. Kalbarczyk, K. Pattabiraman, W. Healey, W.-M. Hwu, P. Klemperer, and R. Farivar. “Toward Application-Aware Security and Reliability.” IEEE Security & Privacy, vol. 5, no. 1, Jan. 2007.
  [19] V. Kiriansky, D. Bruening, and S. Amarasinghe. “Secure execution via program shepherding.” In Proceedings of the 11th USENIX Security Symposium, Aug. 2002.
  [20] M. Castro, M. Costa, and T. Harris. “Securing software by enforcing data-flow integrity.” In Symposium on Operating Systems Design and Implementation (OSDI), Seattle, WA, Nov. 2006.
  [21] M. Abadi, M. Budiu, U. Erlingsson, and J. Ligatti. “Control-flow integrity: Principles, implementations, and applications.” In Proceedings of the ACM Conference on Computer and Communications Security, Nov. 2005.
  [22] G. C. Necula, S. McPeak, and W. Weimer. “CCured: type-safe retrofitting of legacy code.” In Proceedings of the 29th ACM SIGPLAN-SIGACT Symposium on Principles of Programming Languages (POPL ’02), Portland, Oregon, January 16–18, 2002, pages 128–139. ACM Press, New York, NY.
  [23] D. Dhurjati, S. Kowshik, and V. Adve. “SAFECode: enforcing alias analysis for weakly typed languages.” In Proceedings of the 2006 ACM SIGPLAN Conference on Programming Language Design and Implementation (PLDI ’06), pages 144–157. ACM Press.
  [24] D. Boneh, R. A. DeMillo, and R. J. Lipton. “On the Importance of Eliminating Errors in Cryptographic Computations.” Journal of Cryptology, vol. 14, pages 101–119, 2001.
  [25] M. Weiser. “Program slicing.” In Proceedings of the 5th International Conference on Software Engineering, pages 439–449. IEEE Computer Society Press, 1981.
  [26] A. Boileau. “Hit by a Bus: Physical Access Attacks with Firewire.” Presented at Ruxcon 2k6, 2006.
  [27] UIUC Open-IMPACT Effort. “The OpenIMPACT IA-64 Compiler.” http://gelato.uiuc.edu
  [28] J. Gaisler, Gaisler Research. “LEON3 Synthesizable Processor.” http://www.gaisler.com
  [29] W. Healey, K. Pattabiraman, S. Ryoo, P. Dabrowski, W.-M. Hwu, Z. Kalbarczyk, and R. K. Iyer. “Ensuring Critical Data Integrity via Information Flow Signatures.” University of Illinois Technical Report UILU-ENG-07-2216, CRHC-07-09.

Copyright information

© Springer Science+Business Media, LLC 2008

Authors and Affiliations

  • Ravishankar K. Iyer (1)
  • Paul Dabrowski (1)
  • Nithin Nakka (1)
  • Zbigniew Kalbarczyk (1)

  1. Coordinated Science Laboratory, University of Illinois at Urbana-Champaign
