Physical Attacks and Tamper Resistance



Many semiconductor chips used in a wide range of applications require protection against physical attacks, also known as tamper resistance. These attacks assume that the attacker has direct access to the chip and can either establish electrical connections to signal wires or at least perform measurements on it. The importance of protection against physical attacks is dictated by the amount of valuable and sensitive information stored on the chip. This could be secret data, company secrets and intellectual property (IP), electronic money for service access, or banking smartcard credentials. On-chip security serves to deter prospective attackers from gaining unauthorized access and benefiting from it. Many areas rely on the tamper resistance of silicon chips. One of the first was the car industry, with theft protection and car alarms. Then, in the early 1990s, service providers such as pay-TV, satellite-TV, and utility companies realized that their services could be stolen if access and payment cards were not properly protected. From the late 1990s, home entertainment companies realized that their game consoles had become a target of dishonest users who wanted to run illegal copies of games. These days, many device manufacturers, from makers of computer peripherals and mobile phones to makers of printers and computers, are worried about possible IP theft by third parties, whether competitors or subcontractors. All the above challenges force hardware engineers to find secure solutions: either better-protected off-the-shelf chips or their own custom chips. Since in most cases it is impractical to block direct access to the device and its components, protection against physical attacks has become an essential part of system design. Today there is a continuous battle between manufacturers, who invent new security solutions by learning from previous mistakes, and the hacker community, which constantly tries to break the protection in various devices.
Both sides are constantly improving their knowledge and experience, and in this endless war the front line shifts back and forth regularly. At its root, the problem concerns both economics and law. On the one hand, when dishonest people try to steal property, there will be a demand for increased security. On the other, reverse engineering has always been part of technological progress, helping to design compatible products and improve existing ones. The dividing line between legal reverse engineering and illegal piracy is difficult to draw.





Copyright information

© Springer Science+Business Media, LLC 2012

Authors and Affiliations

  1. Computer Laboratory, University of Cambridge, Cambridge, UK
