This document summarizes the Cell processor verification process used by Sony, Toshiba, and IBM for a new microprocessor. A hierarchical verification methodology was used, breaking the design into partition, island, unit, and block levels, and key metrics such as code coverage, passing rates, reviews, and bug rates were tracked at each level. Overall, the methodology was effective: 95% of bugs were found at the lower levels, with only 3.5% remaining at the full-chip level.
3. Background
Sony, Toshiba, and IBM started the design process for a new high-performance microprocessor in 2001
IBM's custom server verification methodology was used as a base
A new vendor tool was introduced
IBM Cycle sim was used as the simulator of choice
Most of the other verification tools used were IBM internal tools
The Cell processor met its goals and delivered excellent levels of performance and power efficiency
256 GFlops (SP) @ 4 GHz
~234M transistors
~235 mm² in a 90nm SOI process
4. Overview
Multi-core and non-homogeneous architecture
One control-optimized core – the PPE
  Manages and allocates tasks for the SPEs
Eight compute-optimized cores – the SPEs
  Handle computationally intensive tasks
XDR Memory Interface
Custom IO Interface
Three completely different asynchronous clocks
Non-critical logic runs with a 2x clock
5. Architecture
[Block diagram: the PPE and eight SPEs (SPE0–SPE7, each containing an SPU with its Local Store (LS) and a Memory Flow Controller (MFC)) sit on the Element Interconnect Bus (EIB). The Bus Interface Controller (BIC) connects the EIB to a Southbridge chip or to another Cell chip; the Memory Interface Controller (MIC) connects it to main memory.]
7. Verification Planning
Planning theory: top-down specification / bottom-up implementation
Plan out all environments needed to create a quality chip
Break the design into partition, island, unit, and block levels
Define clear goals for each level
Implement environments from the block level up
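The planning flow above can be sketched in a few lines. This is a minimal illustration, not the project's actual tooling: the level names come from the slides, while the goal strings and helper function are assumptions for the sake of the example.

```python
# Top-down specification: levels listed from the whole chip downward,
# each paired with an illustrative goal for its environments.
PLAN = [
    ("full-chip", "chip correctness"),
    ("partition", "multi-unit interaction"),
    ("island",    "processor-element functionality"),
    ("unit",      "thorough coverage"),
    ("block",     "basic functionality"),
]

def implementation_order(plan):
    """Bottom-up implementation: build environments lowest level first."""
    return [name for name, _goal in reversed(plan)]

print(implementation_order(PLAN))
# ['block', 'unit', 'island', 'partition', 'full-chip']
```

The point of the split is that the specification starts from chip-level intent, while implementation effort starts where the environments are smallest and fastest to bring up.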
9. Statistics
[Figure: verification environment hierarchies – roughly 30 simulation environments in total. Unit/island environments (SPU, PU, MFC, BIC, MIC, EIB) targeted thorough coverage, functionality testing, and processor-element functionality validation, and found 95% of the bugs. Partition environments (SPE, PPE) targeted connectedness and interface assumptions: 0.2% of the bugs. Full-chip and multi-processor environments (Cell, 4-SPE Cell) targeted chip correctness: 3.5% of the bugs.]
10. Metrics
The major metrics are:
Effective Passing Rate
  Includes testcases written as per plan, testcases running, and testcases passing
Effective Coverage
  Includes coverage implemented as per plan, coverage on line, and coverage hit
Checkers
  Includes the % of checkers implemented and on line as per plan
Reviews
  Includes the % of reviews conducted as per plan
Bug Rate
  Includes logic bugs as well as environment bugs
Sim Cycles
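One plausible reading of the "effective" metrics above is that an item only counts once it clears every gate: a testcase must be written as per plan, running, and passing; a coverage event must be implemented, on line, and hit. The sketch below encodes that reading; the exact formulation used on the project is not given in the slides, so the field names and planned totals here are illustrative.

```python
def effective_passing_rate(testcases, planned_total):
    """Fraction of planned testcases that are written, running, AND passing."""
    effective = sum(1 for tc in testcases
                    if tc["written"] and tc["running"] and tc["passing"])
    return effective / planned_total

def effective_coverage(events, planned_total):
    """Fraction of planned coverage events implemented, on line, AND hit."""
    effective = sum(1 for ev in events
                    if ev["implemented"] and ev["on_line"] and ev["hit"])
    return effective / planned_total

# Hypothetical status snapshot: 4 testcases planned, 3 written so far.
testcases = [
    {"written": True, "running": True,  "passing": True},
    {"written": True, "running": True,  "passing": False},
    {"written": True, "running": False, "passing": False},
]
print(effective_passing_rate(testcases, planned_total=4))  # 0.25
```

Gating on the plan total (rather than on what exists so far) keeps the metric from looking artificially healthy early on, when only a few easy testcases have been written.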
11. Metrics – cont…
These metrics are assigned weights that differ from environment to environment.
Environments themselves are also assigned weights according to their complexity, newness, etc.
Based on all this information, verification progress can be calculated as a mostly upward curve (as opposed to a number that keeps going up and down).