Brian Oblivion
System
Enclosure
Circuit
Firmware
Attack Scenarios
System
Firmware
Circuit
Enclosure
Trusted Base
System
Minimal functionality
– Trusted base to verify the integrity of firmware and/or Operating System
– Secure store for secrets
– Secrets never leave the base unencrypted
– Security Kernel
Example: Cryptlib [4]
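The "secrets never leave the base unencrypted" rule can be sketched as an API boundary, assuming a hypothetical module prefix `tb`: the key lives only inside the base, callers can request operations on data, and there is deliberately no call that returns the key. XOR stands in for a real cipher purely for illustration.

```c
#include <stddef.h>
#include <stdint.h>

/* Secret key: file-scope static, so no pointer to it ever escapes
 * the trusted base's translation unit. (Hypothetical provisioning.) */
static uint8_t tb_key[16] = { 0x5a };

/* The only operation exposed: transform a buffer in place.
 * There is deliberately no tb_get_key(). */
void tb_encrypt(uint8_t *buf, size_t len)
{
    for (size_t i = 0; i < len; i++)
        buf[i] ^= tb_key[i % sizeof tb_key];
}
```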
Trusted Base example
System
[Diagram: trusted base example. A crypto system-on-chip (CSOC) containing firmware and data memory (data memory may be dual-ported), connected to the host processor and main memory (DRAM) via a memory-mapped bus, with transfer interface(s) and a separate bus for bulk encrypt/decrypt of communication data to external memory.]
Failure Modes
System
Firmware
Circuit
Enclosure
Secure Programming Practice
Firmware
Code obfuscation & symbol stripping
Use compiler optimizations
Remove functionality not needed in production
Two versions of firmware: Development, Prod.
Remove symbol tables, debug info.
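The two-build practice above can be sketched with the preprocessor: debug-only functionality is compiled out of the production image, so the shipped binary contains neither the code nor its strings. `DEBUG_BUILD` is an assumed project-defined macro, not anything from the original deck.

```c
#include <stdio.h>

#ifdef DEBUG_BUILD
/* Development build only: present in the dev image, absent from prod. */
static void dump_state(const char *tag, int value)
{
    printf("DBG %s=%d\n", tag, value);
}
#else
/* Production build: compiles to nothing, leaves no strings behind. */
#define dump_state(tag, value) ((void)0)
#endif

int process(int input)
{
    int result = input * 2;
    dump_state("result", result);
    return result;
}
```

Pairing this with symbol stripping (and compiler optimization) keeps the production image free of both the debug code and the symbols that would help a reverse engineer.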
Secure Programming Practice
Firmware
Buffer overflows [5]
Highly publicized and attempted
If interfacing to a PC, an overflow in the driver code
could potentially lead to compromise
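A minimal sketch of the defensive habit implied here: never trust a length field that arrives over the device/PC interface. The packet format and `PAYLOAD_MAX` limit are hypothetical.

```c
#include <string.h>
#include <stdint.h>

#define PAYLOAD_MAX 64

/* Returns the number of bytes copied, or 0 if the claimed length
 * would overflow dst -- reject instead of trusting the peer. */
size_t copy_payload(uint8_t *dst, const uint8_t *pkt, size_t claimed_len)
{
    if (claimed_len > PAYLOAD_MAX)
        return 0;
    memcpy(dst, pkt, claimed_len);
    return claimed_len;
}
```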
Boot Sequence
Firmware
[Diagram: two boot models plotted over time from hardware reset. Host system: Flash (BIOS), then FlashDisk or fixed disk. Embedded boot model: common boot code (may be ROM), then FlashDisk or fixed disk, then embedded OS or state machine, then new or overloaded functionality/applications.]
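The trusted-base role in the boot sequence can be sketched as follows: before jumping to the main firmware image, the boot code checks it against a stored digest. A trivial additive checksum stands in for a real cryptographic hash here, purely for illustration.

```c
#include <stdint.h>
#include <stddef.h>

/* Toy digest -- a real design would use a cryptographic hash. */
static uint32_t checksum(const uint8_t *img, size_t len)
{
    uint32_t sum = 0;
    for (size_t i = 0; i < len; i++)
        sum += img[i];
    return sum;
}

/* Returns 1 if the image may be booted, 0 if it must be rejected. */
int verify_image(const uint8_t *img, size_t len, uint32_t expected)
{
    return checksum(img, len) == expected;
}
```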
Run-Time Diagnostics
Firmware
PCB Design
Circuit
Components
– Epoxy encapsulation of critical components
Protection
– Set min. & max. operating limits
– Protect against intentional voltage variation:
  watchdogs (ex: Maxim, Dallas Semi.), dc-dc converters, regulators, diodes
– Monitor clock signals to detect variations
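In firmware terms, the protection bullets amount to checks like the one below: compare a sampled supply voltage against the configured operating limits before servicing the watchdog. The names and millivolt limits are hypothetical.

```c
#include <stdint.h>

#define VDD_MIN_MV 3000  /* assumed minimum operating limit */
#define VDD_MAX_MV 3600  /* assumed maximum operating limit */

/* Returns 1 if the supply is inside limits (watchdog may be fed),
 * 0 if an intentional or accidental excursion was detected. */
int supply_ok(uint16_t vdd_mv)
{
    return vdd_mv >= VDD_MIN_MV && vdd_mv <= VDD_MAX_MV;
}
```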
I/O Port Properties
Circuit
– Make use of on-chip security features
FPGA design
– Make sure all conditions are covered
– State machines should have default states in place
Memory
– Be aware of what information is being stored in memory at all times [6] (e.g. passwords, private keys, etc.)
– Prevent back-powering of non-volatile memory devices
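A sketch of the memory-hygiene point: clear secrets as soon as they are no longer needed, writing through a `volatile` pointer so the compiler cannot optimize the wipe away as a dead store.

```c
#include <stddef.h>
#include <stdint.h>

/* Zeroize a secret; volatile keeps the stores from being elided. */
void secure_wipe(void *buf, size_t len)
{
    volatile uint8_t *p = (volatile uint8_t *)buf;
    while (len--)
        *p++ = 0;
}
```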
Advanced Memory Management
Circuit
Often implemented in small FPGA
Bounds checking in hardware
Execution, R/W restricted to defined memory
DMA restricted to specified areas only
Trigger response based on detection of “code
probing” or error condition
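The FPGA bounds checker above can be modeled in software: each access is allowed only if it falls in a defined region carrying the matching permission; anything else would trigger the response. The region table and permission bits are hypothetical.

```c
#include <stdint.h>
#include <stddef.h>

enum { PERM_R = 1, PERM_W = 2, PERM_X = 4 };

struct region { uint32_t base, limit; uint8_t perms; };

static const struct region regions[] = {
    { 0x0000, 0x3FFF, PERM_R | PERM_X },  /* code: read/execute only */
    { 0x4000, 0x7FFF, PERM_R | PERM_W },  /* data: no execute        */
};

/* Returns 1 if the access is permitted; 0 means trigger a response. */
int access_ok(uint32_t addr, uint8_t perm)
{
    for (size_t i = 0; i < sizeof regions / sizeof regions[0]; i++)
        if (addr >= regions[i].base && addr <= regions[i].limit)
            return (regions[i].perms & perm) == perm;
    return 0;  /* address outside every defined region */
}
```

In hardware the same table sits between the processor and memory, so even compromised firmware cannot read or execute outside its assigned regions.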
Bus Management
Circuit
COMSEC Requirements
Keep black (encrypted) and red (in-the-clear)
buses separate
Data leaving the device should always be black
Be aware of data on shared buses
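One way to enforce "data leaving the device should always be black" in firmware is to tag buffers red or black and make the external transmit path refuse anything still red. The tagging scheme here is a hypothetical illustration, not anything from the original deck.

```c
#include <stddef.h>

enum color { RED, BLACK };  /* RED = in-the-clear, BLACK = encrypted */

struct buffer {
    enum color color;
    const unsigned char *data;
    size_t len;
};

/* Returns the number of bytes "sent", or 0 if the buffer is still red. */
size_t tx_external(const struct buffer *b)
{
    if (b->color != BLACK)
        return 0;  /* never let red data off the device */
    /* ... hand b->data to the external interface here ... */
    return b->len;
}
```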
Enclosure
Tamper Proofing
Enclosure
Tamper Resistance
Tamper Evidence
Tamper Detection
Ex: temperature sensors, radiation sensors, micro-switches, magnetic switches, nichrome wire, pressure contacts, flex circuit, fiber optics
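Detection only matters if it is wired to a response. A minimal sketch, assuming a hypothetical sensor bitmask: poll the tamper inputs and report when any has tripped, so the response path (e.g. zeroizing keys) can fire.

```c
#include <stdint.h>

/* Hypothetical sensor bit assignments. */
#define TAMPER_SWITCH 0x01  /* micro/magnetic switch opened */
#define TAMPER_TEMP   0x02  /* temperature out of range     */
#define TAMPER_MESH   0x04  /* nichrome wire / mesh broken  */

/* Returns 1 if any sensor tripped and a response must fire. */
int tamper_detected(uint8_t sensor_bits)
{
    return (sensor_bits & (TAMPER_SWITCH | TAMPER_TEMP | TAMPER_MESH)) != 0;
}
```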
Tamper Proofing
Enclosure
Tamper Response
Immunity
Clean, properly filtered power supply
EMI Shielding
Coatings, sprays, housings
Electrostatic discharge protection
Could be injected by an attacker to cause failures
Diodes, Transient Voltage Suppressor devices
(e.g. Semtech)
External Interfaces
Enclosure
As a designer:
Think as an attacker would
As the design progresses, allocate time to
analyze and attempt to break the product
Peer review
Third-party analysis
Be aware of latest attack methodologies &
trends
References
1. Maher, David P., “Fault Induction Attacks, Tamper Resistance, and Hostile Reverse
Engineering in Perspective,” Financial Cryptography, February 1997, pp. 109-121
2. Timing Attacks, Cryptography Research, Inc.,
http://www.cryptography.com/timingattack/
3. Beck, F., “Integrated Circuit Failure Analysis: A Guide to Preparation Techniques,”
John Wiley & Sons, Ltd., 1998
4. Gutmann, P., Cryptlib, “The Design of a Cryptographic Security Architecture,”
Usenix Security Symposium 1999,
http://www.cs.auckland.ac.nz/~pgut001/cryptlib.html
5. Mudge, “Compromised Buffer Overflows, from Intel to SPARC version 8,”
http://www.L0pht.com/advisories/bufitos.pdf
6. Gutmann, P., “Secure Deletion from Magnetic and Solid-State Memory Devices,”
http://www.cs.auckland.ac.nz/~pgut001/secure_del.html
7. “Physical Security and Tamper-Indicating Devices,” http://www.asis.org/midyear-97/Proceedings/johnstons.html
Additional Reading
1. DoD Trusted Computer System Evaluation Criteria (Orange Book),
5200.28-STD, December 1985,
http://www.radium.ncsc.mil/tpep/library/rainbow/5200.28-STD.html
2. Clark, Andrew J., “Physical Protection of Cryptographic Devices,”
Eurocrypt: Advances in Cryptology, April 1987, pp. 83-93
3. Chaum, D., “Design Concepts for Tamper Responding Systems,” Crypto
1983, pp. 387-392
4. Weingart, S.H., White, S.R., Arnold, W.C., Double, G.P., “An Evaluation
System for the Physical Security of Computing Systems,” Sixth Annual
Computer Security Applications Conference 1990, pp. 232-243
5. Differential Power Analysis, Cryptography Research, Inc.,
http://www.cryptography.com/dpa/
6. The Complete, Unofficial TEMPEST Information Page,
http://www.eskimo.com/~joelm/tempest.html
Thanks!