
Design Verification for SoC

Pao-Ann Hsiung
Department of Computer Science and Information Engineering
National Chung Cheng University
Embedded Systems Laboratory
hpa@computer.org
http://www.cs.ccu.edu.tw/~pahsiung/
http://embedded.cs.ccu.edu.tw/
Contents
 SoC Verification Challenges
 Verification Methods
 Simulation Technologies
 Static Technologies
 Formal Technologies
 Physical Verification and Analysis
 SoC Verification Methodologies
 Case Studies
2
What is a System-on-Chip?
 An SoC contains:
  Portable / reusable IP
  Embedded CPU
  Embedded Memory
  Real-world interfaces (USB, PCI, Ethernet)
  Software (both on-chip and off)
  Mixed-signal blocks
  Programmable HW (FPGAs)
  > 500K gates
  Technology: 0.25um and below
  Not an ASIC!
3
Challenges for System-on-Chip
Industry
“ ... the industry is just beginning to fathom the
scope of the challenges confronting those who
integrate blocks of reusable IP on large chips.
Most of the participants summed up the
toughest challenge in one word: verification.”

Source: EE Times (Jan. 20, 1997)


Report on Design Reuse and IP Core Workshop
Organized by DARPA, EDA Industry Council, NIST

4
System-on-Chip Verification
Challenges
 Verification goals
 functionality, timing, performance, power,
physical
 Design complexity
 MPUs, MCUs, DSPs, AMS IPs, ESW,
clock/power distribution, test structures,
interface, telecom, multimedia

5
System-on-Chip Verification
Challenges
 Diversity of blocks (IPs/Cores)
 different vendors
 soft, firm, hard
 digital, analog, synchronous,
asynchronous
 different modeling and description
languages - C, Verilog, VHDL
 software, firmware, hardware
 Different phases in system design
flow
 specification validation, algorithmic, architectural, hw/sw, full timing, prototype
6
Finding/fixing bugs costs in
the verification process
[Figure: the time to fix a bug rises steeply as the design integration stage advances from block to module to system; a bug found at system level forces a respin]
• Increase in chip NREs makes respins an unaffordable proposition
• Average ASIC NRE: ~$122,000
• SoC NREs range from $300,000 to $1,000,000 (NRE = non-recurring engineering)
7
Challenges in DSM technology
for SoC
 Timing Closure
 Sensitive to interconnect delays
 Large Capacity
 Hierarchical design and design reuse
 Physical Properties
 Signal integrity
(crosstalk, IR drop, power/ground bounce)
 Design integrity
(electromigration, hot electron, wire self-heating)
8
Physical issues verification (DSM)

 Interconnects
 Signal Integrity
 P/G integrity
 Substrate coupling
 Crosstalk
 Parasitic Extraction
 Reduced Order Modeling
 Manufacturability and
Reliability
 Power Estimation

9
Physical issues verification (DSM)
Interconnects

 Scaling technology
  interconnects get longer and longer
 Increasing complexity
 New materials for low resistivity
 Inductance and capacitance become more relevant
 Larger and larger impact on the design
 Need to model interconnects and include them in the design choices
(moving from a gate-centric to an interconnect-centric paradigm)
10
Physical issues verification (DSM)
P/G and Substrate

 Analog and digital blocks may share the supply network and substrate
 Can I just plug them together on the same chip? Will it work?
 The switching activity of digital blocks injects noise current that may “kill” sensitive analog blocks
[Figure: a digital IP block injecting noise into an analog block through the shared substrate]
11
Physical issues verification (DSM)
Crosstalk

 In DSM technologies, coupling capacitance dominates interlayer capacitance
 there is a “bridge” between interconnects on the same layer….they interfere with each other!
12
Physical issues verification (DSM)
Parasitic Extraction

 Parasitics play a major role in DSM technologies
 Need to properly extract and model their values
13
Physical issues verification (DSM)
Reduced Order Modeling

 Increasing complexity  bigger and more complex models
  E.g. supply grid, parasitics…
 Need to find a “reduced” model that is
  still a good representation
  of manageable size
14
Physical issues verification (DSM)
Manufacturability

 Design a chip
 Send it to fabrication
 …….
 Did I account for the fabrication process
variations?
 How many of my chips will work?
 Just one? All? Most of them?
 How good is my chip’s performance?

 Design and verification need to account for process variations!
15
Physical issues verification (DSM)
Reliability

 Design a chip
 Send it to fabrication
 …….
 Did I test my design for
different kinds of stress?
 Is it going to work even in
the worst case?
 Can I sell it both in Alaska
and Louisiana?

16
Physical issues verification (DSM)
Power Estimation

 Advent of portable and high-density circuits
  power dissipation of VLSI circuits becomes a critical concern
 Accurate and efficient power estimation techniques are required
17
Design Productivity Gap

[Figure: gates per chip grows much faster than design productivity (gates per hour), opening a widening productivity gap between 1990 and 2000]
18
SoC Design/Verification Gap
[Figure: design complexity (FFs) and test/simulation complexity grow faster than simulator performance, opening a widening verification gap on the way to System-on-Chip]
Source: Cadence
19
Verification Methods
 Simulation Technologies
 Static Technologies
 Formal Technologies
 Physical Verification and Analysis

20
Simulation Technologies
 Event-driven Simulators
 Cycle-based Simulators
 Rapid Prototyping Systems
 Emulation Systems
 Speeding up Simulators (C, BFM, ISS,…)
 Testing & Coverage-driven Verification
 Assertion-based Verification
 HW/SW Cosimulation
 AMS Modeling and Simulation
21
Hardware Simulation
 Event-driven
  compiled code
  native compiled code (directly producing optimized object code)
  – very slow
  + asynchronous circuits, timing verification, initialize to known state
 Cycle-based
  + faster (3-10x faster than native compiled code)
  – synchronous designs only, no timing verification, cannot handle x, z states
[Figure: cycle-based evaluation: state elements and combinational logic evaluated once per clock]
22
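The event-driven scheme above can be sketched in a few lines of Python (a language-neutral illustration, not a real simulator; the two-gate netlist is invented for this example): only gates in the fan-out of a changed signal are re-evaluated, which is what suits the approach to asynchronous, sparsely active circuits.

```python
# Minimal event-driven gate simulator sketch (illustrative only).
def simulate(gates, values, stimulus):
    """gates: {output: (eval_fn, [input names])}; stimulus: {signal: value}."""
    events = list(stimulus.items())          # seed queue with input changes
    while events:
        sig, val = events.pop(0)
        if values.get(sig) == val:
            continue                         # no value change -> no event
        values[sig] = val
        for out, (fn, ins) in gates.items():
            if sig in ins:                   # re-evaluate only the fan-out
                events.append((out, fn(*[values[i] for i in ins])))
    return values

# Two-gate netlist: n = NOT a; y = n AND b
gates = {"n": (lambda a: 1 - a, ["a"]),
         "y": (lambda n, b: n & b, ["n", "b"])}
vals = simulate(gates, {"a": 1, "b": 1, "n": 0, "y": 0}, {"a": 0})
print(vals["y"])   # 1: the change on a ripples through n to y
```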
Simulation: Performance vs
Abstraction
[Figure: performance and capacity grow with abstraction level, from SPICE (.001x) through event-driven simulators (1x) to cycle-based simulators (10x)]
23
Validating System-on-Chip
by Simulation
 Need for both cycle-based and event-driven
 asynchronous interfaces
 verification of initialization
 verification of buses, timing
 Need for mixed VHDL/Verilog simulators
 IP from various vendors
 models in different languages

 SoC verification not possible by current simulation tools
 Growing gap between the amount of verification desired and the amount that can be done
  1 million times more simulation load than a chip in 1990 (Synopsys)
24
Rapid Prototyping Systems
[Figure: the firmware design flow (algorithm → source code → object code) and the hardware design flow (behavior → RTL → gate) meet at integration test through an in-circuit emulator (ICE)]
25
Emulation Systems
 Emulation: Imitation of all or parts of the target system by another
system, the target system performance achieved primarily by hardware
implementation
 In-Circuit Emulator (ICE): A box of hardware that can emulate the processor in the target system. The ICE can execute code in the target system’s memory or code downloaded to the emulator.
 ICE also can be fabricated as silicon within the processor-core: provides
interface between a source level debugger and a processor embedded
within an ASIC
 Provides real-time emulation
 Supports functions such as breakpoint setting, single step execution, trace
display and performance analysis
 Provides a C-source debugger
 Examples: EmbeddedICE macrocell in ARM7TDMI, NEC V850 family of processors, LSI Logic
26
Embedded ICE Macrocell
[Figure: the EmbeddedICE macrocell controls the ARM7TDMI core through scan chains (a data-bus scan chain plus traditional boundary scan), driven via a TAP controller over a 5-pin JTAG interface]
Source: ARM
27
Embedded ICE in ARM7TDMI
Core

[Figure: a debug host connects through the EmbeddedICE interface to the EmbeddedICE macrocell and ARM core inside the ASIC]
Source: ARM
28
Debugging environment
for CPU core
[Figure: the IE-784000-R in-circuit emulator attaches to the target system through a breadboard; the emulator is created using standard LSI, FPGA, and gate-array parts]
Source: NECEL
29
Enhancing Simulation Speed
Using Simulation Models
 Hardware model
 Behavioral model in C
 Bus-functional model (BFM)
 Instruction-Set simulation (ISS) model
 instruction accurate
 cycle accurate
 Full-timing gate-level model
 encrypted to protect IP
30
Hardware Model
 Use the actual physical device to model its
own behavior during simulation
 Advantages: accuracy, full device
functionality, including any undocumented
behavior
 Disadvantages: delivers 1 to 10
instructions/sec, cost
 Example
 Logic Modeling (Synopsys) Hardware Models
31
Behavioral Model
 Behavior of the core modeled in C
 Example: Memory models from Denali
 30-70% of system chip area is memory => power, latency,
area of chip
 In typical simulation, conventional models consume as much as
90% of workstation memory
 C models of DRAM, SRAM, Flash, PROM, SDRAM, EEPROM,
FIFO
 RAMBUS, Configurable Cache
 parameterizable models, common interface to all simulators
 allows adaptive dynamic allocation, memory specific debugging
32
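The “adaptive dynamic allocation” idea behind such memory models can be sketched as follows (a hypothetical Python illustration, not Denali’s actual implementation): only locations that are actually written consume host memory, so a multi-gigabyte address space costs almost nothing to model.

```python
# Sparse memory model sketch: unwritten locations take no host memory.
class SparseMemory:
    def __init__(self, size, default=0):
        self.size = size
        self.default = default
        self.store = {}            # address -> value, allocated on demand

    def write(self, addr, value):
        assert 0 <= addr < self.size, "address out of range"
        self.store[addr] = value

    def read(self, addr):
        assert 0 <= addr < self.size, "address out of range"
        return self.store.get(addr, self.default)  # unwritten -> default

mem = SparseMemory(1 << 32)        # 4 GB address space, near-zero host cost
mem.write(0x1000, 0xAB)
print(hex(mem.read(0x1000)), mem.read(0x2000), len(mem.store))
```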
Bus Functional Model (BFM)
 Idea is to remove the application code and the target processor
from the hardware simulation environment
 Performance gains by using host processor’s capabilities instead
of simulating same operation happening on target processor
 Varying degrees of use of host processor leads to different
models
 Bus functional model
 only models the interface circuitry (bus), no internal functionality
 usually driven by commands: read, write, interrupt, …
 bus-transaction commands converted into a timed sequence of
signal transitions: fed as events to traditional hardware simulator
 Bus functional model emulates “transactions”:
 Read/Write Cycles (single/burst transfers)
 Interrupts
33
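As a rough illustration of how a BFM expands a transaction command into a timed sequence of signal transitions (the signal names, strobe polarity, and cycle timing here are invented, not any real bus protocol):

```python
# Bus-functional model sketch: one high-level "write" transaction is
# expanded into pin-level events that a hardware simulator could consume.
def bfm_write(addr, data, t0=0, cycle=10):
    events = []
    events.append((t0,             "ADDR", addr))   # drive address
    events.append((t0,             "WR_N", 0))      # assert write strobe
    events.append((t0 + cycle,     "DATA", data))   # drive data next cycle
    events.append((t0 + 2 * cycle, "WR_N", 1))      # deassert strobe
    return events

evts = bfm_write(0x80, 0x5A)
for t, sig, val in evts:
    print(t, sig, val)
```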
Compiled Code Simulation
[Figure: application code is compiled to the host processor; its I/O bus transactions drive a bus-functional model, which feeds events to the hardware simulator]
 Host code is not equal to target code
 Low-level debugging not possible
 E.g. observing processor internal registers
 Measurements may be inaccurate
 E.g. cycle counts
34
Instruction Set Simulation
(ISS)
 Full functional accuracy of the processor as viewed from pins
 Operations of CPU modeled at the register/instruction level
 registers as program variables
 instructions as program functions which operate on register values
 Data path that connects the registers are abstracted out
 Allows both high-level and assembly code to be debugged
 Instruction Accurate
 accurate at instruction boundaries only
 correct bus operations, and total number of cycles, but no guarantee
of state of CPU at each clock cycle; inaccuracy due to bus contention
 Cycle Accurate
 guarantees the state of the CPU at every clock cycle
 guarantees exact bus behavior
 slower than instruction-accurate, but faster than full behavioral model
Source: LSI Logic, Mentor Graphics
35
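The register/instruction-level modeling style can be sketched as a toy ISS in Python (a hypothetical two-opcode machine, not any real processor): registers are program variables, each instruction is a function, and the datapath is abstracted away entirely.

```python
# Toy instruction-set simulator sketch (illustrative machine only).
def run(program):
    regs = {"r0": 0, "r1": 0, "r2": 0}     # registers as variables
    pc, cycles = 0, 0
    ops = {                                 # instructions as functions
        "MOV": lambda r, v: regs.__setitem__(r, v),
        "ADD": lambda r, a, b: regs.__setitem__(r, regs[a] + regs[b]),
    }
    while pc < len(program):
        op, *args = program[pc]
        ops[op](*args)
        pc += 1
        cycles += 1     # instruction-accurate: one "cycle" per instruction
    return regs, cycles

regs, n = run([("MOV", "r0", 2), ("MOV", "r1", 3),
               ("ADD", "r2", "r0", "r1")])
print(regs["r2"], n)   # 5 3
```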
ISS Example
 Example Simulator: Microtec XRAY Sim™
 Fast: 100,000 instructions/sec
 Software debug: source code debugging,
register and memory views

[Figure: XRAY debugger screens: register view, memory view, and source-level debug]
36
Example of Simulation Models
NEC provides the following simulation models:
 Behavioral C model: used in early design stage, for functional
verification, fastest execution
 RTL model with timing wrapper: for accurate timing and function
verification
 Verilog gate-level model: for final design verification, very slow

[Figure: the V851 C-model with Soft ICE (for emulation and debugging) sits inside a timing wrapper with a Verilog interface, connected to RAM, ROM, and user logic in the Verilog simulation]
37
Testing
 Verification environment
 Commonly referred to as a testbench
 Definition of a testbench
 A verification environment containing a set of
components [such as bus functional models
(BFMs), bus monitors, memory modules] and the
interconnect of such components with the design
under-verification (DUV)
 Verification (test) suites (stimuli, patterns,
vectors)
 Test signals and the expected response under
given testbenches
38
Testbench Design
 Auto or semi-auto stimulus generator is
preferred
 Automatic response checking is highly
recommended
 May be designed with the following
techniques
 Testbench in HDL
 Testbench in programming language interface
(PLI)
 Waveform-based
 Transaction-based
 Specification-based
39
Types of Verification Tests
 Random testing
 Try to create scenarios that engineers do not
anticipate
 Functional testing
 User-provided functional patterns
 Compliance testing
 Corner case testing
 Real code testing (application SW)
 Avoid misunderstanding the specification
40
Types of Verification Tests
 Regression testing
 Ensure that fixing a bug will not introduce
other bugs
 Regression test system should be
automated
 Add new tests
 Check results and generate report
 Distribute simulation over multiple computers
 Time-consuming process when verification
suites become large
41
Coverage-driven Verification
 Coverage reports can indicate how much of
the design has been exercised
 Point out what areas need additional verification
 Optimize regression suite runs
 Redundancy removal (to minimize the test suites)
 Minimizes the use of simulation resources
 Quantitative sign-off (the end of verification
process) criterion
 Verify more but simulate less

42
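The idea of monitoring which parts of the design a test suite exercises can be illustrated with a small Python sketch (a hypothetical design model with hand-inserted counters, not how commercial coverage tools instrument HDL):

```python
# Coverage-monitoring sketch: count how often each branch of a design
# model executes, so the regression suite's holes become visible.
coverage = {"sel_true": 0, "sel_false": 0}

def mux(sel, a, b):
    if sel:
        coverage["sel_true"] += 1
        return a
    coverage["sel_false"] += 1
    return b

# A test suite that only ever drives sel=1 ...
for _ in range(10):
    mux(1, "A", "B")

uncovered = [b for b, hits in coverage.items() if hits == 0]
print(uncovered)   # ['sel_false'] -> this branch needs more tests
```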
The rate of bug detection

[Figure: the rate of bug detection over the course of the verification effort]
source: “Verification Methodology Manual for Code Coverage in HDL Designs” by Dempster and Stuart

43
Coverage Analysis
 Dedicated tools are required besides the simulator
 Several commercial tools for measuring Verilog and
VHDL code coverage are available
 VCS (Synopsys)
 NC-Sim (Cadence)
 Verification Navigator (TransEDA)
 Basic idea is to monitor the actions during simulation
 Require supports from the simulator
 PLI (programming language interface)
 VCD (value change dump) files

44
Analysis Results
 Verification Navigator (TransEDA)
  Untested code lines will be highlighted
45
Assertion-Based Verification
 Assertion-based verification (ABV) solutions are
gaining in popularity
 Assertions are statements of designer assumptions or design
intent
 Assertions should be inherently reusable
 These supplement, not replace, traditional simulation
tests
 Design and verification engineer knowledge is leveraged
 Both design observability and design controllability can be
improved
 Assertions enable formal verification

46
Assertions
 Assertions can capture interface and bus rules
 In ABV, protocol monitors are written using assertions
 Each individual protocol rule is captured by an assertion,
usually temporal (multi-cycle) in nature
 Example: Signal A should assert between 3 and 5 cycles
after signal B, but only if signal C is deasserted
 Internal assertions capture design intent
 Example: this FIFO should never receive a write when it is
already full
 Example: this state machine variable should always be
encoded as one-hot
 These improve observability in simulation but still rely on
tests for stimulus
47
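The temporal rule in the example above can be checked over a simulation trace with a few lines of Python (an illustrative trace checker, not a PSL/SVA implementation; signal names follow the example):

```python
# Trace-based checker sketch for the rule: after B asserts while C is
# deasserted, A must assert between 3 and 5 cycles later.
def check_a_follows_b(trace):
    """trace: list of dicts {'A':0/1,'B':0/1,'C':0/1}, one per cycle."""
    failures = []
    for t, cyc in enumerate(trace):
        if cyc["B"] == 1 and cyc["C"] == 0:        # rule is armed
            window = trace[t + 3:t + 6]            # cycles t+3 .. t+5
            if not any(c["A"] == 1 for c in window):
                failures.append(t)                 # assertion "fires"
    return failures

cyc = lambda a, b, c: {"A": a, "B": b, "C": c}
good = [cyc(0, 1, 0)] + [cyc(0, 0, 0)] * 2 + [cyc(1, 0, 0)] + [cyc(0, 0, 0)] * 2
bad = [cyc(0, 1, 0)] + [cyc(0, 0, 0)] * 5
print(check_a_follows_b(good), check_a_follows_b(bad))   # [] [0]
```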
Assertion Checkers
 Checkers check the assertion conditions
 Checkers “fire” on indications of bugs (e.g., FIFO write when full)
 Improve observability but still rely on tests for stimulus
 Certain types of checkers can be inferred directly
from the RTL code
 Example: Arithmetic overflow on a computation
 Example: Proper synchronization across an asynchronous
clock domain boundary
 The most valuable assertion checkers come from
information in the designer’s head
 It must be easy to capture assertions

48
Assertion Capture
Checkers can be written in many ways
 In a testbench language (C, C++, e, Vera, etc.)
 Directly in RTL (Verilog or VHDL)
 With RTL assertion constructs (VHDL, SystemVerilog)
 Using assertion libraries (OVL)
 In a formal property language (PSL, Sugar, ForSpec, etc.)
 Embedded in RTL with pseudo-comments (used by
0-In and several other EDA vendors)
 Assertion capture should be as easy as possible
 Designers don’t want to learn a new language
 Assertion checker libraries provide a lot of leverage

49
Complete ABV Flow
[Figure: complete ABV flow: the RTL design plus assertions (captured with an assertion library and processed by an assertion compiler) feed a standard Verilog simulator together with the simulation testbench, producing coverage reports; a formal model compiler also feeds static and dynamic formal verification, producing formal metrics; automatic RTL checks run alongside]
50
IP Verification with Checkers
[Figure: directed tests from a standard-interface verification environment and an application-interface environment drive an IP block saturated with checkers]
 Use checkers during interface IP creation
 Saturate RTL code with checkers
 Use checkers on interfaces as trip-wires
 Report illegal inputs and scenarios not handled
 Deliver IP with assertion checkers included
51
SoC Integration with Checkers
[Figure: an SoC with CPU, RAM, custom IP, decode logic, and standard interconnect on an on-chip bus; checkers monitor the interfaces during system regression]
 Checkers accelerate SoC integration
 Ensure that standard protocol is never violated
 Detect illegal inputs or invalid assumptions by user
 Improve observability in SoC simulation
 Speed up bug discovery and diagnosis

52
Hardware-Software Co-Simulation

[Figure: co-simulation activity spectrum: with a processor model and memory, activity is high (700-1000 instructions per I/O bus cycle); with hardware only, activity is low (only during processor I/O cycles); most bus cycles are instruction or data fetches]
53
Hardware-Software Co-Simulation:
Implementation
[Figure: the application program (assembly) runs on an ISS processor model backed by host memory; memory and signal synchronization couples it to the HDL model, where a bus-functional model drives the memory model]
54
Seamless CVE™: Comprehensive
System Wide Analysis & Debug
XRAY™ Sim Seamless CVE™ Logic Simulator

[Figure: XRAY Sim and a logic simulator are coupled through Seamless CVE, with synchronization and optimization via a memory image server]
Source: Mentor Graphics

55
Analog Behavior Modeling
 A mathematical model written in Hardware
Description Language
 Emulate circuit block functionality by sensing
and responding to circuit conditions
 Available Analog/Mixed-Signal HDL:
 Verilog-A
 VHDL-A
 Verilog-AMS
 VHDL-AMS
56
Mixed Signal Simulation

57
Static Technologies
 Inspection and Lint Checking
 Static Timing Analysis

58
Inspection & Lint Checking
 For designers, finding bugs by careful
inspection is often faster than that by
simulation
 Inspection process
 Design (specification, architecture) review
 Code (implementation) review
 Line-by-line fashion
 At the sub-block level
 Lint-like tools can help spot defects without simulation
  Nova ExploreRTL, VN-Check, ProVerilog, …

59
HDL Linter
 Fast static RTL code checker
 Preprocessor of the synthesizer
 RTL purification (RTL DRC)
 Syntax, semantics, simulation
 Check for built-in or user-specified rules
 Testability checks
 Reusability checks
 ……
 Shorten design cycle
 Avoid error code that increases design iterations

60
Static Timing Analysis (STA)
 STA is a method for determining if a circuit meets
timing constraints (setup, hold, delay) without having
to simulate
 No input patterns are required
 100% coverage if applicable
 Challenging: multiple sources

Reference: Synopsys
61
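The core computation behind STA, propagating worst-case arrival times through a delay-annotated DAG without any input patterns, can be sketched as follows (the netlist and delays are invented for illustration):

```python
# Static timing analysis sketch: relax edge delays in topological
# (Kahn) order -- no input vectors needed, hence "100% coverage".
from collections import defaultdict, deque

def max_arrival(edges, inputs):
    succ, indeg = defaultdict(list), defaultdict(int)
    for u, v, d in edges:
        succ[u].append((v, d))
        indeg[v] += 1
    arrival = {i: 0.0 for i in inputs}      # primary inputs arrive at t=0
    ready = deque(inputs)
    while ready:
        u = ready.popleft()
        for v, d in succ[u]:
            arrival[v] = max(arrival.get(v, 0.0), arrival[u] + d)
            indeg[v] -= 1
            if indeg[v] == 0:
                ready.append(v)
    return arrival

# a -> g1 (2 ns), g1 -> g2 (3 ns), b -> g2 (1 ns), g2 -> out (1 ns)
edges = [("a", "g1", 2.0), ("g1", "g2", 3.0),
         ("b", "g2", 1.0), ("g2", "out", 1.0)]
worst = max_arrival(edges, ["a", "b"])["out"]
print(worst)   # 6.0: the critical path is a -> g1 -> g2 -> out
```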
Formal Technologies
 Formal Verification: An analytic way of proving a system correct
  no simulation triggers, stimuli, inputs
  no test-benches, test-vectors, test-cases
 Formal Verification Methods
  Deductive Reasoning (theorem proving)
  Model Checking
  Equivalence Checking
62
Theorem Proving
 Uses axioms, rules to prove system
correctness
 No guarantee that it will terminate
 Difficult, time consuming: for critical
applications only
 Not fully automatic
63
Model Checking
 Automatic technique to prove
correctness of concurrent systems:
 Digital circuits
 Communication protocols
 Real-time systems
 Embedded systems
 Control-oriented systems
 Explicit algorithms for verification
64
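A minimal explicit-state model checker for safety properties can be sketched as follows (a toy example, far from the symbolic techniques real tools use); note how the returned counterexample is exactly a simulation trace:

```python
# Explicit-state model checking sketch: exhaustively explore the finite
# state space (BFS) and return a counterexample path to any "bad" state.
from collections import deque

def check_safety(initial, next_states, is_bad):
    frontier = deque([(initial, [initial])])
    seen = {initial}
    while frontier:
        state, path = frontier.popleft()
        if is_bad(state):
            return path                      # counterexample found
        for nxt in next_states(state):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append((nxt, path + [nxt]))
    return None                              # property holds on all behaviors

# Mod-4 counter; the property "counter never reaches 3" is violated.
cex = check_safety(0, lambda s: [(s + 1) % 4], lambda s: s == 3)
print(cex)   # [0, 1, 2, 3]
```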
Equivalence Checking
 Checks if two circuits are equivalent
 Register-Transfer Level (RTL)
 Gate Level
 Reports differences between the two
 Used after:
 clock tree synthesis
 scan chain insertion
 manual modifications
65
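For a small combinational cone, equivalence checking reduces to comparing truth tables; a sketch follows (real tools use BDD- and SAT-based reasoning rather than enumeration, and the two "implementations" here are invented):

```python
# Combinational equivalence checking sketch: two circuits are equivalent
# iff they agree on every input vector; exhaustively sweep the inputs.
from itertools import product

def equivalent(f, g, n_inputs):
    for vec in product((0, 1), repeat=n_inputs):
        if f(*vec) != g(*vec):
            return False, vec        # report a differing input vector
    return True, None

rtl = lambda a, b: a ^ b                       # RTL view: XOR
gate = lambda a, b: (a & ~b | ~a & b) & 1      # gate-level view: same function
print(equivalent(rtl, gate, 2))   # (True, None)
```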
Why Formal Verification?
 Simulation and test cannot handle all
possible cases (only some possible ones)
 Simulation and test can prove the presence
of bugs, rather than their absence
 Formal verification conducts exhaustive
exploration of all possible behaviors
 If verified correct, all behaviors are verified
 If verified incorrect, a counter-example (proof)
is presented
66
Why Formal Verification Now?
 SoC has a high system complexity
 Simulation and test are taking
unacceptable amounts of time
 More time and efforts devoted to
verification (40% ~ 70%) than design
 Need automated verification methods
for integration into design process

67
Increased Simulation Loads

68
Why Formal Verification Now?
Examples of undetected errors
 Ariane 5 rocket explosion, 1996
 Exception occurred when converting 64-bit
floating number to a 16-bit integer!
 Pentium FDIV bug
  Division lookup table not fully verified!

69
70
Verification Tasks for SoC

71
Property Checking vs.
Equivalence Checking

72
Model (Property) Checking
 Algorithmic method of verifying
correctness
 of (finite state) concurrent systems
 against temporal logic specifications
 A practical approach to formal
verification

73
Model Checking
What is necessary for Model Checking?
 A mathematically precise model of the
system
 A language to state system properties
 A method to check if the system
satisfies the given properties

74
Formal Verification Issues
 State-space explosion!!!
 Cannot handle large systems!
 For control-oriented behavior of small
modules
 For interface-centric verification
 Constrained for feasible verification
 Supplementary to simulation
 Counterexample  simulation trace

75
Physical Verification & Analysis
Issues for physical verification:
 Timing
 Signal Integrity
 Crosstalk
 IR drop
 Electromigration
 Power analysis
 Process antenna effects
 Phase shift mask
 Optical proximity correction
76
Comparing Verification
Options

77
Comparing HW/SW
Coverification Options

78
Which is the fastest option?
 Event-based simulation
 Best for asynchronous small designs
 Cycle-based simulation
 Best for medium-sized designs
 Formal verification
 Best for control-oriented designs
 Emulation
 Best for large capacity designs
 Rapid Prototype
 Best for software development
79
SoC Verification Methodology
 System-Level Verification
 SoC Hardware RTL Verification
 SoC Software Verification
 Netlist Verification
 Physical Verification
 Device Test
80
SoC Verification Methodology

81
Verification Approaches
 Top-Down Verification
 Bottom-Up Verification
 Platform-Based Verification
 System Interface-Driven Verification

82
Top-Down SoC Verification

[Figure: top-down SoC verification flow]
83
Bottom-Up SoC Verification
[Figure: bottom-up verification proceeds from components, blocks, and units, through the memory map and internal interconnect, to basic functionality and external interconnect, and finally to system level]
84
Platform-Based SoC
Verification
 Derivative design: interconnect verification between
  the SoC platform
  newly added IPs
85
System Interface-driven
SoC Verification

Besides the design under test, all other blocks are interface models
86
System-on-Chip Design and Validation Flow
[Figure: flow from system spec validation, through mapping/partitioning of cores (MPU, MCU, DSP, peripherals, interface, multimedia, telecom/networking) with timing and power estimators, algorithmic co-validation using C and HDL models, S/W implementation (ISS) and H/W synthesis (HDL) of soft and hard cores (CPU, UDL), cycle-based architectural co-validation, DFT and test generation, ASIC integration on the bus architecture, full-timing verification with core and co-processor models (CPU, UDL, ROM/RAM, PCI/peripherals, MPEG), and finally ASIC prototype verification]
87
Embedded Software Implementation and Validation
[Figure: software tasks pass through performance and power estimators; tasks are mapped to CPUs (compiler, assembler, linker, instruction-set simulator), scheduled as multiple tasks with priority selection, and integrated on multiprocessors with an RTOS (protocols, shared memory); a co-simulator with hardware, a debugger, and an emulator support the flow down to software implementation]
88
Verification of Cores in a High-Level Design Flow
[Figure: behavioral VHDL specs are compiled (CFG), scheduled into a cycle-by-cycle DFG under user constraints (resource, performance, etc.), shared into functional RTL VHDL, optimized at RT level using delay/power/testability estimators into structural RTL (controller + datapath), then verified using generated test benches and formal methods before mapping and physical synthesis]
89
Integration of Cores: Verification of Interfaces

[Figure: an AMBA-based SoC: CPU, DMA, ROM, and RAM on the high-speed AHB, an external bus interface for test access, and a bridge to the low-power APB serving the peripherals]
Source: ARM
AMBA: Advanced Microcontroller Bus Architecture
90
Device Test
 To check if devices are manufactured
defect-free
 Focus on structure of chip
 Wire connections
 Gate truth tables
 Not functionality

91
Device Test
Challenges in SoC device test:
 Test Vectors: Enormous!
 Core forms: soft, firm, hard (each needs different tests)
 Cores: logic, mem, AMS, …
 Accessibility: very difficult / expensive!

92
Device Test Strategies
 Logic BIST (Built-In-Self-Test)
 Stimulus generators embedded
 Response verifiers embedded
 Memory BIST
 On-chip address generator
 Data generator
 Read/write controller (mem test algorithm)
 Mixed-Signal BIST
 For AMS cores: ADC, DAC, PLL
 Scan Chain
 Timing and Structural compliance
 ATPG tools generate manufacturing tests automatically
 IEEE P1500 SECT (next time!) 93
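The logic-BIST structure, an on-chip LFSR stimulus generator plus a response compactor whose final signature is compared against a golden value, can be sketched as follows (the polynomial, widths, and the "circuit" are invented for illustration, not from any standard):

```python
# Logic BIST sketch: LFSR generates pseudo-random stimulus; a simple
# rotate-and-XOR register compacts the responses into a signature.
def lfsr_stream(seed, taps, width, n):
    state, out = seed, []
    for _ in range(n):
        out.append(state)
        fb = 0
        for t in taps:                    # feedback = XOR of tap bits
            fb ^= (state >> t) & 1
        state = ((state << 1) | fb) & ((1 << width) - 1)
    return out

def signature(responses, width=8):
    sig = 0
    for r in responses:                   # rotate-and-XOR compactor
        sig = (((sig << 1) | (sig >> (width - 1))) & ((1 << width) - 1)) ^ r
    return sig

stim = lfsr_stream(seed=0b1001, taps=(3, 0), width=4, n=8)
good = signature([s ^ 0b1111 for s in stim])         # fault-free "circuit"
bad = signature([(s ^ 0b1111) | 1 for s in stim])    # stuck-at-1 on bit 0
print(good != bad)   # True: the fault corrupts the signature
```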
Case Studies
 ALCATEL Image Compression System
 “Designing an Image Compression System
for a Space Application, Using an IP Based
Approach,” Jean Marie Garigue, Emmanuel
Liegeon, Sophie Di Santo, Louis Baguena,
IP Based Design Conference, 2000.

 VisorVoice Product

94
ALCATEL Image Compression
System
 Design and implement an image
compression subsystem for an
observation satellite
 Observation Satellite Payload
 Image sensors
 Image compression system
 Storage (solid state recorder)
 Transmitter

95
Top Level Requirements
 Accept image data from the space craft
image acquisition unit
 Compress the image data
 Send compressed data to the image
transmission system or on board compressed
image storage system
 Develop a complete solution compact enough
to be integrated with the Solid State Recorder
 Requires an SoC implementation of the
compression system

96
Refined Requirements
 Accept rough digital video data from the
sensors through serial transceivers
 Compress the image with a rate ranging
from 1.2 to 40
 Use a JPEG based compression algorithm
 Optimize the image quality
 Transmit the data to the solid state
recorder and transmitter at a fixed but
programmable rate

97
Specifications
 8 video channels
 10 bits/pixel
 Input: 25 Mpixels/sec – on a serial interface
 Video line size: up to 16k pixels
 Output: 25 Mbytes/sec maximum
 Temperature range: -55°C to +125°C
 Space environment: total dose 10k rad; SEU (Single Event Upset) can change the contents of FFs or memory
 Power budget: 6W/channel
 Size/weight: to be minimized

98
Constraints
 Must meet European Space Agency
standards
 Must meet ALCATEL standards
 Time constraint: final product in 18
months
 Cost constraint: fixed amount not to be
exceeded

99
Input Interface Unit
 Serial Data Link
 25 Mpixels/sec
 hardware implementation
 Requires conversion from analog to CMOS signals and reformatting

 Analog mixed signal block implementation


 Decision
 Implement only the data rearrangement subsystem on the chip

100
DCT Unit

 DCT Calculation
 Converts a waveform (input pixels of a block) into its
frequency components
 Weighting operation allows different quantification factors for
each of the DCT coefficients
 25 Mpixels/sec 
 hardware implementation
101
Huffman Unit

 Quantification
 Calculation requires a complex series of operations  software implementation
 Run length encode DCT coefficients
 Huffman encoding - table lookup
 Packing
 25 Mpixels/sec rate  hardware implementation
102
Regulation Unit
 Principle: Alcatel-patented algorithm
 Status of the regulation buffer is monitored
 Quantification factor is calculated as a
complex function of the level of the buffer
and the distribution of the DCT coefficients
 The distribution is obtained by constructing a
histogram of the coefficients
 25 Mpixel Rate  hardware
implementation
103
SoC System Architecture

104
Initial Design
 The initial design required 0.5 µm technology and special design techniques
 The system could not be implemented as a
SoC – a multichip version was implemented

105
Initial System

106
SOC Design
 After 0.35 µm technology was qualified (rated), an SoC version was implemented
 Example of value of internal IP reuse

107
SoC Verification Methodology
 Step One
 Complete system was modeled in ‘C’
 Algorithms were “tuned”
 Compression – to include the quantification factor
 Regulation – to be based on a patented ALCATEL
algorithm
 System performance was checked using a set of
“reference” images
 Enabled system designers to gain concurrence of
the “customer” on a set of reference images

108
SoC Verification Methodology -
continued
 Step 2
 Each function checked against the
corresponding intermediate results of the
‘C’ model
 Step 3
 Complete system was checked against the
‘C’ model

109
System Verification
Alternatives
 Simulation
 Emulation
 FPGA prototypes

110
Verification
 Individual functions were simulated
 Simulation of entire system to process
an image (6 Mpixels) – estimate: 1000
hours
 Hence, an alternate emulation based
approach was adopted
 Based on a CELARO machine
 Can emulate 4M gates - enabled the complete
SoC to be mapped into the emulator

111
Emulation
 SoC was mapped after synthesis
 ALCATEL developed a translation of the
ASIC library to the CELARO library
 Enabled checking not only behavior but
also physical implementation after
synthesis
 Behavioral VHDL models of the
memories and microcontroller (DSP)
were implemented in the workstation
112
Emulation Result
 Emulation enabled fast system gate
level verification
 1 hour 40 minutes for processing 6M Pixels
by a 1.1M gate chip at the gate level vs.
1000 hours of simulation time

113
VisorVoice - Informal Product
Description

114
VisorVoice
 A device that lets us:
 Record digital messages
 Replay the messages
 Download / upload the
messages to and from a
desktop computer
 Interoperates with a
handheld PDA

115
VisorVoice Functionality
 What does the VisorVoice need to do?
 Record a message
 Play previously recorded message
 Delete a message
 Inserts into Visor Expansion slot
 Hot-Sync with PC

116
VisorVoice Performance
 Record a meeting,
record notes
 A typical meeting:
60 - 120 minutes
 Recording notes: 2
minutes each, 60
maximum
 Speech quality: has
to be good enough

117
VisorVoice Performance
 Battery operated,
portable
 Battery has to last a
minimum of 120
minutes
 May want longer
battery life to permit
playing back
messages
 When battery fails,
the messages should
not be lost

118
Can We Do It for $50?
 VisorVoice target price: $50
 Medium volume
 Performance is moderate (speech
recording)
 Complexity is low
 Canonical SoC + speech input/output + Visor
interface

119
VisorVoice Requirements

120
VisorVoice Functional States
 What states can VisorVoice be in?
 Powered off
 Stopped - not doing anything (play or record)
 Playing - playback of voice recordings
 Recording - capturing a new recording
 Paused – playback/recording temporarily stopped
 Volume adjust - changing volume up/down
 Equalizer - changing settings
 Message Management, Seek - save, restore,
delete, and find messages

121
VisorVoice Functional Diagram

122
VisorVoice User Interface

123
Play/Record Input Controls
 Icon buttons – no value settings
 Play – Starts playback of selected message
 Record – Creates new message with default name
and starts recording
 Pause – Stops playback or record operation and
holds position
 Step back – Moves playback selection backward to
previous start of message
 Step forward – Moves playback selection forward
to next start of message
124
Functional Architecture

125
IP Selection
 GUI
 Custom software to be designed and implemented
to run on the Visor
 PalmOS Development toolsets available (commercial &
open source)
 Springboard Functions provided in a library by
Handspring
 RISC Processor
 Chose the ARM7TDMI processor core
 Available from ARM Ltd
 ISP and VHDL models available

126
IP Selection
 AMBA Bus
 Also available from ARM Ltd
 AMBA Development Kit
 VHDL models of components included
 Visor Interface
 Custom interface to be designed as part of
the SoC implementation
 VHDL model developed
127
IP Selection
 A/D, D/A, Amplifiers, etc.
 IP search resulted in a single IP block solution
from Chipidea
 A 14 bit codec
 Only model available was in a proprietary
language
 Developed MatLab and VHDL models
 VisorVoice Codec
 ChipIdea voice codec CI7075oa includes:
 14-bit linear A-to-D and D-to-A conversion
 Decimation and interpolation filtering
 Microphone and speaker interfacing that works with the
Visor’s microphone and module-mounted speaker
128
IP Selection
 Compression/Decompression Function
 IP search revealed public domain software IP
available
 Used G.726 ADPCM (ITU Standard)
 16 Kbps encoding (32:1 ratio)
 Decision – implement these functions in software
on the ARM processor.
 Equalization Filter
 Design Decision – implemented as a
programmable hardware FIR filter with
coefficients downloaded by the ARM as a function
of the band settings set by the user via the Visor
GUI
129
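The design decision above, a FIR filter whose coefficients the ARM downloads at run time as the user changes band settings, can be sketched as follows (tap count and coefficients are illustrative only, not the VisorVoice design):

```python
# Programmable FIR equalizer sketch: the CPU "downloads" a coefficient
# set; the filter convolves incoming samples with those coefficients.
class ProgrammableFIR:
    def __init__(self, n_taps):
        self.coeffs = [0.0] * n_taps
        self.delay = [0.0] * n_taps          # shift register of past samples

    def load_coefficients(self, coeffs):     # what the host CPU would write
        assert len(coeffs) == len(self.delay)
        self.coeffs = list(coeffs)

    def step(self, sample):
        self.delay = [sample] + self.delay[:-1]
        return sum(c * x for c, x in zip(self.coeffs, self.delay))

fir = ProgrammableFIR(4)
fir.load_coefficients([0.25, 0.25, 0.25, 0.25])   # 4-tap moving average
out = [fir.step(s) for s in [4, 4, 4, 4]]
print(out[-1])   # 4.0 once the delay line is full
```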
Additional Design Decisions
 The CODEC and the Visor interface
requirements were relatively low speed,
hence the decision to interface these
units to the Advanced Peripheral Bus
(APB)
 Required APB-to-AHB bridge
 provided by ARM

130
VisorVoice Architecture

131
Design Platform
 Four Models on a common platform
 “C”-level Behavioral Model
 Software Development Model
 Hardware Development Model
 Co-Verification model
 Common Platform (Mentor tools)
 Seamless – Coverification
 Modelsim – Hardware
 X-Ray – host processor software
 C-Bridge – behavior C-language Model

132
Behavioral/Algorithmic Model

133
Requirements & Results
 Requirements
 C models
 Gen.c, Show.c: generate/save pcm code file words
 control.c : Emulates visor interface from x-ray
window
 decfilt.c, fir.c: Compression/Decompression
 mem.c: Store and receive compressed data
 Results
 Verified performance of operation flows
 C models of system components to use in
software development environment

134
Software Development Model

135
Requirements & Results
 Requirements
 C-language models for IP blocks
 Host Software IP
 ARM7TDMI simulation model
 Results
 Tested VisorVoice/ARM host software

136
HW Development Model

137
Requirements & Results
 Requirements
 VHDL models for hardware blocks
 Test Generator software
 Results
 Tested Hardware IP blocks for VisorVoice

138
Cosimulation Model

139
Requirements & Results
 Requirements
 C models for ARM software
 VHDL models for hardware blocks
 Results
 Verified VisorVoice system level design

140
References
 System-on-a-Chip Verification: Methodology and Techniques, Prakash Rashinkar, Peter Paterson, Leena Singh, Kluwer Academic Publishers, 2001
 Prof. Chien-Nan Liu’s slides on SoC Verification, NCU, Taiwan
141
