
Secure Hardware Design

The Black Hat Briefings
July 26-27, 2000

Brian Oblivion, Kingpin
[oblivion, kingpin]@atstake.com
Why Secure Hardware?
• Embedded systems now common in the industry
  – Hardware tokens, smartcards, crypto accelerators, Internet appliances
• Detailed analysis & reverse engineering techniques available to all
• Increase the difficulty of attack: the means exist
Solid Development Process
• Clearly identified design requirements
• Identify risks in the life-cycle
• Secure build environment
• Hardware/software revision control
• Verbose design documentation
• Secure assembly and initialization facility
• End-of-life recommendations
• Identify single points of failure
• Security fault analysis
• Third-party design review
Sources of Attack
• Attacker resources and methods vary greatly

Resource       Teenager    Academic     Org. Crime   Gov't
Time           Limited     Moderate     Large        Large
Budget ($)     <$1,000     $10K-$100K   $100K+       Unknown
Creativity     Varies      High         Varies       Varies
Detectability  High        High         Low          Low
Target         Challenge   Publicity    Money        Varies
Number         Many        Moderate     Few          Unknown
Organized      No          No           Yes          Yes
Spread info?   Yes         Yes          Varies       No

Source: Cryptography Research, Inc. 1999, "Crypto Due Diligence"
Accessibility to Product
• Purchase: all attacks possible
• Evaluation: most attacks possible, with risk of detection
• Active, in-service: most attacks possible
• Remote access: no physical access

Attack Scenarios
• System
• Enclosure
• Circuit
• Firmware
Attack Scenarios
• System
  – Initial experimentation & probing
  – Viewed as a "black box"
  – Can be performed remotely
  – Bootstrapping attacks
Attack Scenarios
• Enclosure
  – Gaining access to product internals
  – Probing (X-ray, thermal imaging, optical)
  – Bypassing tamper-proofing mechanisms
Attack Scenarios
• Circuit
  – PCB design & parts placement analysis
  – Component substitution
  – Active bus and device probing
  – Fault induction attacks [1]
  – Timing attacks [2]
  – Integrated circuit die analysis [3]
Attack Scenarios
• Firmware
  – Low-level understanding of the product
  – Obtain & modify intellectual property
  – Bypass system security mechanisms
  – Ability to mask failure detection
Attack Scenarios
• Strictly firmware - no product needed!
  – Obtain firmware from the vendor's public-facing web site
  – Can be analyzed and disassembled without detection
What Needs To Be Protected?
• Firmware binaries
• Boot sequence
• Cryptographic functionality (offloaded to coprocessor)
• Secret storage and management
• Configuration and management communication channels
System
[Section divider: layer diagram of System, Firmware, Circuit, Enclosure]
Trusted Base (System)
• Minimal functionality
  – Trusted base to verify the integrity of firmware and/or operating system
  – Secure store for secrets
  – Secrets never leave the base unencrypted
  – Security kernel
• Examples of a trusted base
  – A single IC (some provide secure store for secrets)
  – May be purchased or custom built (secure coprocessor)
  – All internals: circuit boards, components, etc.
  – Entire trusted base resides within the tamper envelope
  – Firmware
  – Security kernel
Security Kernel (System)
• Better when implemented in the trusted base, but can function in the OS
• Enforces the security policy
• Ability to decouple secrets from the OS (sketch below)
• Example: Cryptlib [4]
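A minimal sketch of what decoupling secrets from the OS can look like at the interface level: the OS holds only opaque handles, and raw key material never crosses the kernel boundary. All names below are illustrative assumptions, not Cryptlib's actual API.

```c
/* Hypothetical security-kernel interface: the OS refers to secrets by
 * handle and never sees raw key material. Names are illustrative. */
#include <stddef.h>
#include <stdint.h>

typedef uint32_t key_handle_t;          /* opaque reference; raw key stays inside the kernel */

/* Generate a key inside the trusted base; only a handle is returned. */
int sk_generate_key(key_handle_t *out_handle);

/* Encrypt/decrypt happen inside the kernel; the OS supplies only buffers. */
int sk_encrypt(key_handle_t key, const uint8_t *in, size_t in_len,
               uint8_t *out, size_t *out_len);

/* Keys may leave the base only wrapped under a key-encryption key. */
int sk_export_wrapped(key_handle_t key, key_handle_t kek,
                      uint8_t *wrapped, size_t *wrapped_len);
```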
Trusted Base Example (System)
[Block diagram: a CSOC (cryptographic system-on-chip) with external data memory (may be dual-ported) on its own bus; communication/transfer interface(s) carry bulk encrypt/decrypt traffic; a memory-mapped bus connects the host processor, firmware memory, and main memory (DRAM)]
Failure Modes (System)
• Determine how the product handles failures
  – Fail-open or fail-closed?
• Response depends on failure type
  – Halt system
  – Set failure flags and continue
  – Zeroization of critical areas (see the sketch below)
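A fail-closed failure handler, as a minimal sketch; the names and the storage layout are assumptions, not from the original slides.

```c
/* Fail-closed failure handler sketch: transient failures are flagged
 * and execution continues; anything worse destroys secrets and halts. */
#include <stdint.h>

typedef enum { FAIL_TRANSIENT, FAIL_SELFTEST, FAIL_TAMPER } fail_type_t;

static uint8_t  critical_store[256];    /* keys and other secrets */
static volatile uint32_t fail_flags;    /* audit record of what went wrong */

static void zeroize(uint8_t *p, unsigned n)
{
    volatile uint8_t *vp = p;
    while (n--) *vp++ = 0;              /* volatile writes defeat dead-store elimination */
}

void handle_failure(fail_type_t type)
{
    fail_flags |= (1u << type);         /* set failure flags */
    if (type == FAIL_TRANSIENT)
        return;                         /* log and continue */
    zeroize(critical_store, sizeof critical_store);  /* fail closed */
    for (;;) ;                          /* halt the system */
}
```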
Management Interfaces (System)
• Do not include service backdoors!
• Utilize access control
• Encrypt all management sessions
  – SSH for shell administration
  – SSL for web administration
Firmware
[Section divider: layer diagram of System, Firmware, Circuit, Enclosure]
Secure Programming Practice (Firmware)
• Code obfuscation & symbol stripping
• Use compiler optimizations
• Remove functionality not needed in production
  – Two versions of firmware: development and production (sketch below)
  – Remove symbol tables, debug info
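One way to keep development-only functionality out of production builds is to compile it away entirely rather than disable it at run time. A sketch, assuming a build-time macro (DEVELOPMENT_BUILD is an illustrative name, not from the slides):

```c
/* Development-only code is removed at compile time, so the production
 * binary contains no trace of it. */
#ifdef DEVELOPMENT_BUILD
#include <stdio.h>
#define DBG(...)  printf(__VA_ARGS__)
void debug_shell(void);                 /* interactive shell: dev builds only */
#else
#define DBG(...)  ((void)0)             /* compiles to nothing in production */
#endif

void boot_main(void)
{
    DBG("entering boot_main\n");
#ifdef DEVELOPMENT_BUILD
    debug_shell();                      /* symbol absent from production builds */
#endif
    /* ... production code path ... */
}
```

Pairing this with symbol stripping (e.g. the toolchain's strip utility) keeps symbol tables and debug information out of the shipped image.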
Secure Programming Practice (Firmware)
• Buffer overflows [5]
  – Highly publicized and attempted
  – If interfacing to a PC, driver code with an overflow could potentially lead to compromise (example below)
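The classic pattern behind most publicized overflows, next to a bounded alternative. This is an illustrative fragment, not taken from any real driver:

```c
#include <string.h>

#define NAME_MAX 32

void parse_unsafe(const char *pkt)
{
    char name[NAME_MAX];
    strcpy(name, pkt);                  /* no length check: overflow if pkt >= 32 bytes */
}

void parse_safe(const char *pkt, size_t pkt_len)
{
    char name[NAME_MAX];
    size_t n = pkt_len < NAME_MAX - 1 ? pkt_len : NAME_MAX - 1;
    memcpy(name, pkt, n);               /* copy at most what the buffer holds */
    name[n] = '\0';                     /* always terminate */
}
```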
Boot Sequence (Firmware)
[Diagram: common boot model on a host system. From hardware reset, over time: Flash/BIOS (may be ROM), then FlashDisk or fixed disk, then new or overloaded functionality, then the embedded OS or state machine, then applications]
Trusted Boot Sequence (Firmware)
[Diagram: as above, but boot begins in a CSOC bootrom running POST and the security kernel. The bootrom and flash are verified first, then the embedded OS, then the applications; each stage is verified before control is handed off (sketch below)]
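A sketch of the verify-then-hand-off step in C, assuming the trusted base exposes a signature primitive. The image layout and function names are assumptions for illustration:

```c
/* Chain-of-trust sketch: each boot stage verifies the next image before
 * transferring control to it. */
#include <stdint.h>
#include <stddef.h>

struct image {
    const uint8_t *code;
    size_t         len;
    const uint8_t *signature;           /* created at build time with the vendor key */
};

extern int  verify_signature(const uint8_t *data, size_t len,
                             const uint8_t *sig);  /* provided by the security kernel */
extern void fail_closed(void);                     /* zeroize and halt */

void boot_next_stage(const struct image *img)
{
    if (!verify_signature(img->code, img->len, img->signature))
        fail_closed();                  /* refuse to run unverified code */
    ((void (*)(void))(uintptr_t)img->code)();      /* hand off control */
}
```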
Run-Time Diagnostics (Firmware)
• Make sure the device is 100% operational at all times
• Periodic system checks (sketch below)
• A failing device may result in compromise
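A sketch of how the periodic checks might be driven, reusing the failure handler from the failure-modes sketch. The individual test routines are assumptions standing in for device-specific code:

```c
/* Periodic self-test sketch: a timer tick or main loop drives checks of
 * RAM, firmware integrity, and tamper sensors. */
#include <stdbool.h>

extern bool ram_test(void);             /* e.g. walking-bit test over scratch RAM */
extern bool firmware_checksum_ok(void); /* recompute and compare stored digest */
extern bool tamper_sensors_ok(void);    /* poll enclosure sensors */
extern void handle_failure(int type);   /* see the failure-modes sketch above */

void periodic_diagnostics(void)         /* run from a timer tick or main loop */
{
    if (!ram_test() || !firmware_checksum_ok())
        handle_failure(1);              /* FAIL_SELFTEST */
    if (!tamper_sensors_ok())
        handle_failure(2);              /* FAIL_TAMPER */
}
```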
Secret Management (Firmware)
• Never leak unencrypted secrets
• Escrow mechanisms are a security hazard
  – If required, perform at key generation, in the physical presence of humans
  – Physically export the Key Encryption Key and protect it
  – Export other keys encrypted with the Key Encryption Key (sketch below)
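A sketch of export under a Key Encryption Key: plaintext keys never cross the device boundary. The wrapping primitive and key sizes are assumptions for illustration:

```c
/* Export-under-KEK sketch: only the wrapped form leaves the device. */
#include <stdint.h>
#include <stddef.h>

/* Provided inside the trusted base, e.g. a block-cipher key wrap. */
extern int wrap_key(const uint8_t kek[32], const uint8_t *key, size_t key_len,
                    uint8_t *wrapped, size_t *wrapped_len);

int export_key(const uint8_t kek[32], const uint8_t *key, size_t key_len,
               uint8_t *out, size_t *out_len)
{
    /* The KEK itself left the device only once, at key generation,
     * under human supervision; everything else leaves wrapped. */
    return wrap_key(kek, key, key_len, out, out_len);
}
```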
Cryptographic Functions (Firmware)
• If possible, move out of firmware...
• ...into an ASIC
  – Difficult to modify the algorithm; cannot be upgraded easily
  – Increased performance
• ...into a commercial CSOC or FPGA
  – Can reconfigure for other algorithms
  – May also provide key management
  – Increased performance
  – Reconfiguration via signed download procedure (CSOC only)
Field Programmability (Firmware)
• Is your firmware accessible to everyone from your product support web page?
• Encryption
  – Compressing the image is not secure
  – Encrypting code will limit exposure of intellectual property
• Code signing
  – Reduces the possibility of loading unauthorized code (sketch below)
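A signed-download sketch: the loader accepts a new image only if its signature verifies against a public key fixed in the device. The header layout, magic value, and function names are assumptions for illustration:

```c
/* Firmware loader sketch: reject any image whose signature does not
 * verify against the device's immutable public key. */
#include <stdint.h>
#include <stddef.h>

struct fw_header {
    uint32_t magic;
    uint32_t image_len;
    uint8_t  signature[64];             /* signature over the image body */
};

extern const uint8_t device_pubkey[32];            /* immutable, in ROM/OTP */
extern int  sig_verify(const uint8_t *pubkey, const uint8_t *msg,
                       size_t len, const uint8_t *sig);
extern void flash_write(const uint8_t *img, size_t len);

int load_firmware(const struct fw_header *hdr, const uint8_t *body)
{
    if (hdr->magic != 0x46574C44u)                 /* illustrative magic value */
        return -1;
    if (!sig_verify(device_pubkey, body, hdr->image_len, hdr->signature))
        return -1;                                  /* reject unauthorized code */
    flash_write(body, hdr->image_len);
    return 0;
}
```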
Circuit
[Section divider: layer diagram of System, Firmware, Circuit, Enclosure]
PCB Design (Circuit)
• Remove unnecessary test points
• Keep traces as short as possible
• Keep differential lines parallel (even if on separate layers)
• Separate analog, digital & power GND planes
• Alternate power and GND planes
Parts Placement (Circuit)
• Make critical components difficult to access
• Place the power filtering circuit as close to the input as possible
• Compartmentalize noisy circuitry (e.g. inductors)
Physical Access to Components (Circuit)
• Epoxy encapsulation of critical components
• Include detection mechanisms in and under the epoxy boundary
Power Supply & Clock Protection (Circuit)
• Set min. & max. operating limits
• Protect against intentional voltage variation
  – Watchdogs (e.g. Maxim, Dallas Semiconductor)
  – DC-DC converters, regulators, diodes
• Monitor clock signals to detect variations
I/O Port Properties (Circuit)
• Use unused pins to detect probing or tampering (esp. for FPGAs) - a "digital honeypot"
• Disable all unused I/O pins
Programmable Logic & Memory (Circuit)
• Make use of on-chip security features
• FPGA design
  – Make sure all conditions are covered
  – State machines should have default states in place (sketch below)
• Be aware of what information is being stored in memory at all times [6] (i.e. passwords, private keys, etc.)
• Prevent back-powering of non-volatile memory devices
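The default-state rule, modeled here in C rather than HDL: every state machine should map undefined or glitched states to a known-safe state instead of continuing. In an FPGA this is the default/others branch of the state register's case statement; the states below are illustrative:

```c
/* State-machine sketch: a fault-induced or undefined state falls into
 * the default branch and fails safe instead of staying unlocked. */
typedef enum { S_IDLE, S_AUTH, S_UNLOCKED } state_t;

state_t step(state_t s, int event_ok)
{
    switch (s) {
    case S_IDLE:     return event_ok ? S_AUTH     : S_IDLE;
    case S_AUTH:     return event_ok ? S_UNLOCKED : S_IDLE;
    case S_UNLOCKED: return S_UNLOCKED;
    default:         return S_IDLE;     /* undefined state: fail safe */
    }
}
```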
Advanced Memory Management (Circuit)
• Often implemented in a small FPGA
• Bounds checking in hardware (sketch below)
  – Execution, read/write restricted to defined memory
  – DMA restricted to specified areas only
• Trigger a response based on detection of "code probing" or an error condition
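Bounds checking as it might be modeled in C before committing the logic to an FPGA: every access is checked against a region table encoding execute/read/write/DMA permissions. All names and the address map are assumptions:

```c
/* Region-table sketch: an access outside all regions, or with the wrong
 * permissions, is denied and can trigger the device's response. */
#include <stdint.h>
#include <stdbool.h>

#define PERM_X   1u
#define PERM_R   2u
#define PERM_W   4u
#define PERM_DMA 8u

struct region { uint32_t base, limit, perms; };

static const struct region regions[] = {
    { 0x00000000u, 0x0003FFFFu, PERM_X | PERM_R },            /* code: execute/read only */
    { 0x20000000u, 0x2000FFFFu, PERM_R | PERM_W },            /* data */
    { 0x20010000u, 0x20013FFFu, PERM_R | PERM_W | PERM_DMA }, /* DMA window */
};

bool access_allowed(uint32_t addr, uint32_t perm_needed)
{
    for (unsigned i = 0; i < sizeof regions / sizeof regions[0]; i++)
        if (addr >= regions[i].base && addr <= regions[i].limit)
            return (regions[i].perms & perm_needed) == perm_needed;
    return false;                       /* outside all regions: trigger response */
}
```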
Bus Management (Circuit)
• COMSEC requirements
  – Keep black (encrypted) and red (in-the-clear) buses separate
  – Data leaving the device should always be black
  – Be aware of data on shared buses
Enclosure
[Section divider: layer diagram of System, Firmware, Circuit, Enclosure]
Tamper Proofing (Enclosure)
• Four approaches: resistance, evidence, detection, response
• Most effective when layered
• Possibly bypassed with knowledge of the method
Tamper Proofing (Enclosure)
• Tamper resistance
  – Hardened steel enclosures
  – Locks
  – Encapsulation, potting
  – Security screws
  – Tight airflow channels with 90° bends to prevent optical probing
  – Side effect: also tamper evident
Tamper Proofing (Enclosure)
• Tamper evidence
  – Major deterrent for minimal risk takers
  – Passive detectors: seals, tapes, cables
  – Special enclosure finishes
  – Most can be bypassed [7]
Tamper Proofing (Enclosure)
• Tamper detection, for example:
  – Temperature sensors, radiation sensors
  – Micro-switches, magnetic switches
  – Nichrome wire, pressure contacts
  – Flex circuits, fiber optics
Tamper Proofing (Enclosure)
• Tamper response
  – Result of tampering being detected
  – Zeroization of critical memory areas (sketch below)
  – Provide audit information
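A tamper-response sketch: volatile pointer writes keep the compiler from optimizing away the wipes, and the audit record is written only after the secrets are gone. Names are illustrative:

```c
/* Tamper-response sketch: destroy critical memory first, then record
 * the event for audit. */
#include <stdint.h>
#include <stddef.h>

extern void audit_log(const char *event); /* stand-in for device audit facility */

void zeroize_region(void *p, size_t n)
{
    volatile uint8_t *vp = (volatile uint8_t *)p;
    while (n--) *vp++ = 0;              /* not removable as a dead store */
}

void tamper_response(void *key_store, size_t key_len)
{
    zeroize_region(key_store, key_len); /* zeroize critical memory areas */
    audit_log("tamper detected");       /* provide audit information */
}
```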
RF, ESD Emissions & Immunity (Enclosure)
• Clean, properly filtered power supply
• EMI shielding
  – Coatings, sprays, housings
• Electrostatic discharge protection
  – Could be injected by an attacker to cause failures
  – Diodes, transient voltage suppressor devices (e.g. Semtech)
External Interfaces (Enclosure)
• Use caution when connecting to the "outside world"
  – Protect against malformed, intentionally bad packets
  – Encrypt or (at least) obfuscate traffic
  – Be aware if interfaces provide access to an internal bus
  – Control bus activity through transceivers
  – Attenuate signals that leak through transceivers with exposed buses (token interfaces)
  – Disable JTAG and diagnostic functionality in operational modes
In Conclusion…
As a designer:
• Think as an attacker would
• As design is in progress, allocate time to analyze and break the product
• Peer review
• Third-party analysis
• Be aware of the latest attack methodologies & trends
References
1. Maher, David P., "Fault Induction Attacks, Tamper Resistance, and Hostile Reverse Engineering in Perspective," Financial Cryptography, February 1997, pp. 109-121.
2. "Timing Attacks," Cryptography Research, Inc., http://www.cryptography.com/timingattack/
3. Beck, F., "Integrated Circuit Failure Analysis: A Guide to Preparation Techniques," John Wiley & Sons, Ltd., 1998.
4. Gutmann, P., "The Design of a Cryptographic Security Architecture" (Cryptlib), USENIX Security Symposium, 1999, http://www.cs.auckland.ac.nz/~pgut001/cryptlib.html
5. Mudge, "Compromised Buffer Overflows, from Intel to SPARC version 8," http://www.L0pht.com/advisories/bufitos.pdf
6. Gutmann, P., "Secure Deletion of Data from Magnetic and Solid-State Memory," http://www.cs.auckland.ac.nz/~pgut001/secure_del.html
7. "Physical Security and Tamper-Indicating Devices," http://www.asis.org/midyear-97/Proceedings/johnstons.html
Additional Reading
1. DoD Trusted Computer System Evaluation Criteria (Orange Book), 5200.28-STD, December 1985, http://www.radium.ncsc.mil/tpep/library/rainbow/5200.28-STD.html
2. Clark, Andrew J., "Physical Protection of Cryptographic Devices," Eurocrypt: Advances in Cryptology, April 1987, pp. 83-93.
3. Chaum, D., "Design Concepts for Tamper Responding Systems," Crypto 1983, pp. 387-392.
4. Weingart, S.H., White, S.R., Arnold, W.C., Double, G.P., "An Evaluation System for the Physical Security of Computing Systems," Sixth Annual Computer Security Applications Conference, 1990, pp. 232-243.
5. "Differential Power Analysis," Cryptography Research, Inc., http://www.cryptography.com/dpa/
6. The Complete, Unofficial TEMPEST Information Page, http://www.eskimo.com/~joelm/tempest.html
Thanks!
