Secure Hardware Design


Presentation Transcript


  1. Secure Hardware Design

2. Secure Hardware Design
The Black Hat Briefings, July 26-27, 2000
Brian Oblivion, Kingpin
[oblivion, kingpin]@atstake.com

3. Why Secure Hardware?
• Embedded systems now common in the industry
• Hardware tokens, smartcards, crypto accelerators, internet appliances
• Detailed analysis & reverse-engineering techniques available to all
• Increase difficulty of attack
• The means exist

4. Solid Development Process
• Clearly identified design requirements
• Identify risks in the life-cycle
• Secure build environment
• Hardware/software revision control
• Verbose design documentation
• Secure assembly and initialization facility
• End-of-life recommendations
• Identify single points of failure
• Security fault analysis
• Third-party design review

5. Sources of Attack
• Attacker resources and methods vary greatly
Source: Cryptography Research, Inc., 1999, "Crypto Due Diligence"

  6. Accessibility to Product

  7. Attack Scenarios

8. Attack Scenarios
• System
  • Initial experimentation & probing
  • Viewed as a "black box"
  • Can be performed remotely
  • Bootstrapping attacks

9. Attack Scenarios
• Enclosure
  • Gaining access to product internals
  • Probing (X-ray, thermal imaging, optical)
  • Bypassing tamper-proofing mechanisms

10. Attack Scenarios
• Circuit
  • PCB design & parts-placement analysis
  • Component substitution
  • Active bus and device probing
  • Fault induction attacks [1]
  • Timing attacks [2]
  • Integrated circuit die analysis [3]

11. Attack Scenarios
• Firmware
  • Low-level understanding of the product
  • Obtain & modify intellectual property
  • Bypass system security mechanisms
  • Ability to mask failure detection

12. Attack Scenarios
• Strictly firmware - no product needed!
  • Obtain firmware from the vendor's public-facing web site
  • Can be analyzed and disassembled without detection

13. What Needs To Be Protected?
• Firmware binaries
• Boot sequence
• Cryptographic functionality (offloaded to a coprocessor)
• Secret storage and management
• Configuration and management communication channels

  14. System

15. Trusted Base
• Minimal functionality
• Trusted base to verify the integrity of firmware and/or operating system
• Secure store for secrets
  • Secrets never leave the base unencrypted
• Security kernel
• Examples of a trusted base:
  • A single IC (some provide secure store for secrets)
    • May be purchased or custom built (secure coprocessor)
  • All internals - circuit boards, components, etc.
    • Entire trusted base resides within tamper envelope
  • Firmware
    • Security kernel

16. Security Kernel
• Better when implemented in the trusted base, but can function in the OS
• Enforces the security policy
• Ability to decouple secrets from the OS
• Example: Cryptlib [4]

  17. Trusted Base example

18. Failure Modes
• Determine how the product handles failures
• Fail-open or fail-closed?
• Response depends on failure type
  • Halt system
  • Set failure flags and continue
  • Zeroization of critical areas (see the sketch below)
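A minimal C sketch of this decision, assuming hypothetical failure codes, a placeholder secret_store, and a halt_system() stub; the volatile-qualified wipe keeps the compiler from discarding the zeroization as a dead store:

```c
#include <stddef.h>
#include <stdint.h>

typedef enum { FAIL_TRANSIENT, FAIL_SELFTEST, FAIL_TAMPER } failure_t;

static volatile uint8_t secret_store[64];   /* critical key material (placeholder) */
static volatile unsigned failure_flags;     /* sticky record of past faults */

/* Overwrite secrets byte by byte through a volatile pointer so the
 * compiler cannot remove the wipe as a dead store. */
static void zeroize(volatile uint8_t *buf, size_t len)
{
    while (len--)
        *buf++ = 0;
}

static void halt_system(void)
{
    for (;;)
        ;                                   /* spin until hard reset */
}

void handle_failure(failure_t cause)
{
    if (cause == FAIL_TRANSIENT) {
        failure_flags |= 1u << cause;       /* set a flag and continue */
        return;
    }
    /* Fail closed: destroy critical areas first, then stop. */
    zeroize(secret_store, sizeof(secret_store));
    halt_system();
}
```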

19. Management Interfaces
• Do not include service backdoors!
• Utilize access control
• Encrypt all management sessions
  • SSH for shell administration
  • SSL for web administration

  20. Firmware

21. Secure Programming Practice
• Code obfuscation & symbol stripping
  • Use compiler optimizations
  • Remove functionality not needed in production
  • Two versions of firmware: development and production (illustrated below)
  • Remove symbol tables and debug information
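A small illustration of the two-build idea; DEVELOPMENT_BUILD and uart_printf() are names invented for this sketch. Debug output exists only in the development image, so production binaries carry neither the tracing code nor its format strings:

```c
void uart_printf(const char *fmt, ...);      /* assumed debug UART routine */

#ifdef DEVELOPMENT_BUILD
#define DBG(...)  uart_printf(__VA_ARGS__)   /* verbose tracing, dev only */
#else
#define DBG(...)  ((void)0)                  /* no code, no strings, in prod */
#endif

int authenticate(const char *user)
{
    DBG("auth attempt for %s\n", user);      /* vanishes from production */
    /* ... real checks ... */
    return 0;
}
```

The production image would additionally be built with its symbol table stripped, as the slide notes.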

22. Secure Programming Practice
• Buffer overflows [5]
  • Highly publicized and frequently attempted
  • If interfacing to a PC, driver code containing an overflow could lead to compromise (see the sketch below)
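A sketch of the defensive pattern against such overflows, using a hypothetical one-byte-length packet layout; the attacker-controlled length is validated against both the actual packet size and the destination buffer before any copy:

```c
#include <stddef.h>
#include <stdint.h>
#include <string.h>

#define CMD_MAX 32

/* Packet layout (hypothetical): [len:1][payload:len] */
int handle_command(const uint8_t *pkt, size_t pkt_len)
{
    uint8_t cmd[CMD_MAX];

    if (pkt_len < 1)
        return -1;                       /* too short to contain a length */
    size_t claimed = pkt[0];             /* attacker-controlled value */
    if (claimed > pkt_len - 1 || claimed > sizeof(cmd))
        return -1;                       /* reject rather than overflow */
    memcpy(cmd, pkt + 1, claimed);       /* bounded by the checks above */
    /* ... interpret cmd ... */
    return 0;
}
```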

23. Boot Sequence
• Trusted boot sequence (diagram; sketched below)
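A minimal sketch of one link in such a chain, assuming a sha256() primitive, a digest held in the trusted base, and stage-1 symbols placed by the linker; each stage verifies the next before handing over control:

```c
#include <stddef.h>
#include <stdint.h>
#include <string.h>

void sha256(const uint8_t *msg, size_t len, uint8_t digest[32]);  /* assumed */

extern const uint8_t trusted_digest[32];   /* stored inside the trusted base */
extern const uint8_t stage1_image[];       /* next boot stage in flash */
extern const size_t  stage1_len;

void boot_next_stage(void)
{
    uint8_t d[32];

    sha256(stage1_image, stage1_len, d);
    if (memcmp(d, trusted_digest, sizeof(d)) != 0) {
        for (;;)
            ;                              /* fail closed: refuse to boot */
    }
    ((void (*)(void))(uintptr_t)stage1_image)();  /* verified: jump */
}
```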

24. Run-Time Diagnostics
• Make sure the device is 100% operational at all times
• Periodic system checks (see the sketch below)
• A failing device may result in compromise
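One way such periodic checks might look; rom_crc32(), read_supply_mv(), and the 5 V limits are assumptions about the platform, and handle_failure() is the failure-mode sketch from earlier:

```c
#include <stdint.h>

uint32_t rom_crc32(void);          /* assumed: CRC over the code region */
int      read_supply_mv(void);     /* assumed: supply rail in millivolts */

typedef enum { FAIL_TRANSIENT, FAIL_SELFTEST, FAIL_TAMPER } failure_t;
void handle_failure(failure_t cause);      /* from the failure-mode sketch */

extern const uint32_t expected_crc;        /* recorded at build time */

/* Called from a timer tick: a device that silently degrades is a device
 * an attacker can push into an exploitable state. */
void periodic_selftest(void)
{
    if (rom_crc32() != expected_crc)
        handle_failure(FAIL_SELFTEST);     /* code image was modified */

    int mv = read_supply_mv();
    if (mv < 4500 || mv > 5500)            /* assumed 5 V +/-10% limits */
        handle_failure(FAIL_SELFTEST);
}
```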

25. Secret Management
• Never leak unencrypted secrets out
• Escrow mechanisms are a security hazard
  • If required, perform at key generation, in the physical presence of humans
  • Physically export the Key Encryption Key and protect it
  • Export other keys encrypted with the Key Encryption Key (sketch below)
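A sketch of the export rule: the only interface that releases key material wraps it under the Key Encryption Key first. aead_wrap() is a stand-in for a vetted key-wrap construction; the 8-byte ciphertext expansion is likewise an assumption of this sketch:

```c
#include <stddef.h>
#include <stdint.h>

/* Assumed primitive: encrypts and integrity-protects `in` under `kek`,
 * writing in_len + 8 bytes to `out` (stand-in for a real key wrap). */
int aead_wrap(const uint8_t kek[16], const uint8_t *in, size_t in_len,
              uint8_t *out);

static uint8_t kek[16];     /* never leaves the trusted base */

/* The only path by which keys leave the device: wrapped under the KEK. */
int export_key(const uint8_t *key, size_t key_len,
               uint8_t *out, size_t out_cap)
{
    if (out_cap < key_len + 8)
        return -1;                          /* caller buffer too small */
    return aead_wrap(kek, key, key_len, out);
}
```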

26. Cryptographic Functions
• If possible, move out of firmware…
• …into an ASIC
  • Difficult to modify algorithm
  • Cannot be upgraded easily
  • Increased performance
• …into a commercial CSOC or FPGA
  • Can be reconfigured for other algorithms
  • May also provide key management
  • Increased performance
  • Reconfiguration via signed download procedure (CSOC only)

27. Field Programmability
• Is your firmware accessible to everyone from your product support web page?
• Encryption
  • Compressing the image is not secure
  • Encrypting code will limit exposure of intellectual property
• Code signing
  • Reduces the possibility of loading unauthorized code (see the sketch below)
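A hedged sketch of a signed-download check; the header layout, FW_MAGIC value, sig_verify(), and flash_write() are all hypothetical stand-ins for whatever signature scheme and flash driver a real design would use:

```c
#include <stddef.h>
#include <stdint.h>
#include <string.h>

struct fw_header {
    uint32_t magic;           /* identifies a firmware image */
    uint32_t body_len;        /* bytes of code that follow the header */
    uint8_t  signature[64];   /* vendor signature over the body */
};

#define FW_MAGIC 0x46574d31u  /* arbitrary constant for this sketch */

extern const uint8_t vendor_pubkey[32];
int sig_verify(const uint8_t sig[64], const uint8_t *msg, size_t len,
               const uint8_t pubkey[32]);           /* assumed primitive */
int flash_write(const uint8_t *img, size_t len);    /* assumed driver */

int load_firmware(const uint8_t *img, size_t img_len)
{
    struct fw_header h;

    if (img_len < sizeof(h))
        return -1;
    memcpy(&h, img, sizeof(h));             /* avoid unaligned access */
    if (h.magic != FW_MAGIC || h.body_len > img_len - sizeof(h))
        return -1;                          /* malformed image */
    const uint8_t *body = img + sizeof(h);
    if (!sig_verify(h.signature, body, h.body_len, vendor_pubkey))
        return -1;                          /* unauthorized code: reject */
    return flash_write(body, h.body_len);   /* accept the signed update */
}
```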

  28. Circuit

29. PCB Design
• Remove unnecessary test points
• Traces as short as possible
• Differential lines parallel (even if on separate layers)
• Separate analog, digital & power GND planes
• Alternate power and GND planes

30. Parts Placement
• Make access to critical components difficult
• Place the power filtering circuit as close to the input as possible
• Compartmentalize noisy circuitry (e.g., inductors)

31. Physical Access to Components
• Epoxy encapsulation of critical components
• Include detection mechanisms in and under the epoxy boundary

32. Power Supply & Clock Protection
• Set min. & max. operating limits
• Protect against intentional voltage variation
  • Watchdogs (e.g., Maxim, Dallas Semiconductor)
  • DC-DC converters, regulators, diodes
• Monitor clock signals to detect variations

33. I/O Port Properties
• Use unused pins to detect probing or tampering (especially on FPGAs) - a digital honeypot (see the sketch below)
• Disable all unused I/O pins
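One way the digital-honeypot idea might be polled from firmware; gpio_read(), tamper_response(), and the pin numbers are assumptions about the platform:

```c
int  gpio_read(int pin);       /* assumed: returns the pin level, 0 or 1 */
void tamper_response(void);    /* assumed: e.g., zeroize and halt */

/* Unused pins, configured as inputs with pull-downs (hypothetical numbers);
 * nothing legitimate ever drives them. */
static const int honeypot_pins[] = { 7, 12, 13 };

void poll_honeypot(void)
{
    unsigned i;
    for (i = 0; i < sizeof(honeypot_pins) / sizeof(honeypot_pins[0]); i++) {
        if (gpio_read(honeypot_pins[i]) != 0)   /* pin should idle low */
            tamper_response();                  /* a probe is driving it */
    }
}
```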

34. Programmable Logic & Memory
• Make use of on-chip security features
• FPGA design
  • Make sure all conditions are covered
  • State machines should have default states in place (illustrated below)
• Be aware of what information is being stored in memory at all times [6] (e.g., passwords, private keys)
• Prevent back-powering of non-volatile memory devices
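The default-state advice targets HDL state machines, but the same defensive pattern is easy to show in C: every switch carries a default arm that forces recovery to a safe state, so an illegal encoding (for example, a glitched state register) cannot wedge the machine. The states and events here are invented for the sketch:

```c
typedef enum { ST_IDLE, ST_CHALLENGE, ST_UNLOCKED } state_t;

/* Advance the machine one step; ev_ok is the "event accepted" input. */
state_t next_state(state_t s, int ev_ok)
{
    switch (s) {
    case ST_IDLE:      return ev_ok ? ST_CHALLENGE : ST_IDLE;
    case ST_CHALLENGE: return ev_ok ? ST_UNLOCKED  : ST_IDLE;
    case ST_UNLOCKED:  return ST_IDLE;   /* one-shot, then relock */
    default:           return ST_IDLE;   /* illegal encoding: recover
                                            to the safe state */
    }
}
```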

35. Advanced Memory Management
• Often implemented in a small FPGA
• Bounds checking in hardware
  • Execution, read/write restricted to defined memory
  • DMA restricted to specified areas only (see the sketch below)
• Trigger a response based on detection of "code probing" or an error condition
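A sketch of the bounds check such a memory manager would apply to DMA requests, with a hypothetical region table; the comparison is arranged so the arithmetic cannot overflow:

```c
#include <stddef.h>
#include <stdint.h>

struct region { uintptr_t base; size_t len; };

/* Windows DMA is allowed to touch (hypothetical memory map). */
static const struct region dma_ok[] = {
    { 0x20000000u, 0x4000u },      /* packet buffers */
};

/* Deny any transfer that does not fit entirely inside an allowed window. */
int dma_allowed(uintptr_t addr, size_t len)
{
    unsigned i;
    for (i = 0; i < sizeof(dma_ok) / sizeof(dma_ok[0]); i++) {
        const struct region *r = &dma_ok[i];
        if (addr >= r->base && len <= r->len &&
            addr - r->base <= r->len - len)
            return 1;
    }
    return 0;                      /* out of bounds: block, maybe respond */
}
```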

36. Bus Management
• COMSEC requirements
  • Keep black (encrypted) and red (in-the-clear) buses separate
  • Data leaving the device should always be black
• Be aware of data on shared buses

  37. Enclosure

38. Tamper Proofing
• Resistance, evidence, detection, response
• Most effective when layered
• Possibly bypassed with knowledge of the method

39. Tamper Proofing
• Tamper Resistance
  • Hardened steel enclosures
  • Locks
  • Encapsulation, potting
  • Security screws
  • Tight airflow channels, 90° bends to prevent optical probing
  • Side effect: also tamper evident

40. Tamper Proofing
• Tamper Evidence
  • Major deterrent for minimal risk takers
  • Passive detectors - seals, tapes, cables
  • Special enclosure finishes
  • Most can be bypassed [7]

41. Tamper Proofing
• Tamper Detection (examples given as a slide graphic)

42. Tamper Proofing
• Tamper Response
  • Result of tampering being detected
  • Zeroization of critical memory areas
  • Provide audit information

43. RF, ESD Emissions & Immunity
• Clean, properly filtered power supply
• EMI shielding
  • Coatings, sprays, housings
• Electrostatic discharge protection
  • Could be injected by an attacker to cause failures
  • Diodes, transient voltage suppressor devices (e.g., Semtech)

44. External Interfaces
• Use caution when connecting to the "outside world"
• Protect against malformed, intentionally bad packets
• Encrypt or (at least) obfuscate traffic
• Be aware if interfaces provide access to an internal bus
  • Control bus activity through transceivers
  • Attenuate signals which leak through transceivers with exposed buses (token interfaces)
• Disable JTAG and diagnostic functionality in operational modes

45. In Conclusion…
As a designer:
• Think as an attacker would
• As design is in progress, allocate time to analyze and break the product
• Peer review
• Third-party analysis
• Be aware of the latest attack methodologies & trends

46. References
[1] Maher, David P., "Fault Induction Attacks, Tamper Resistance, and Hostile Reverse Engineering in Perspective," Financial Cryptography, February 1997, pp. 109-121
[2] Timing Attacks, Cryptography Research, Inc., http://www.cryptography.com/timingattack/
[3] Beck, F., "Integrated Circuit Failure Analysis: A Guide to Preparation Techniques," John Wiley & Sons, Ltd., 1998
[4] Gutmann, P., Cryptlib: "The Design of a Cryptographic Security Architecture," Usenix Security Symposium, 1999, http://www.cs.auckland.ac.nz/~pgut001/cryptlib.html
[5] Mudge, "Compromised Buffer Overflows, from Intel to SPARC version 8," http://www.L0pht.com/advisories/bufitos.pdf
[6] Gutmann, P., "Secure Deletion of Data from Magnetic and Solid-State Memory Devices," http://www.cs.auckland.ac.nz/~pgut001/secure_del.html
[7] "Physical Security and Tamper-Indicating Devices," http://www.asis.org/midyear-97/Proceedings/johnstons.html

47. Additional Reading
• DoD Trusted Computer System Evaluation Criteria (Orange Book), 5200.28-STD, December 1985, http://www.radium.ncsc.mil/tpep/library/rainbow/5200.28-STD.html
• Clark, Andrew J., "Physical Protection of Cryptographic Devices," Eurocrypt: Advances in Cryptology, April 1987, pp. 83-93
• Chaum, D., "Design Concepts for Tamper Responding Systems," Crypto 1983, pp. 387-392
• Weingart, S.H., White, S.R., Arnold, W.C., Double, G.P., "An Evaluation System for the Physical Security of Computing Systems," Sixth Annual Computer Security Applications Conference, 1990, pp. 232-243
• Differential Power Analysis, Cryptography Research, Inc., http://www.cryptography.com/dpa/
• The Complete, Unofficial TEMPEST Information Page, http://www.eskimo.com/~joelm/tempest.html

  48. Thanks!
