Verification and validation of OMAP chips is a large, complex process that involves verifying modules, subsystems, and the full chip. It follows a strict methodology with defined verification plans, reviews, and metrics to help ensure functionality is thoroughly tested at every level from RTL to silicon. Automation, reuse, and collaboration across teams are what make the process feasible given its size and the resources it requires.
4. OMAP2420™ Overview
Verification and Validation
• ARM1136-based SoC includes:
– 330 MHz ARM1136
– 220 MHz TI TMS320C55x™ DSP
– 2D/3D graphics accelerator
– Imaging and video accelerator
– High-performance system interconnects and industry-standard peripherals
5. OMAP3430™ overview
• New OMAP™ 3 architecture combines mobile entertainment with high-performance productivity applications
• Industry's first processor with an advanced superscalar ARM® Cortex™-A8 RISC core, enabling a 3x gain in performance
• Industry's first processor designed in a 65-nm CMOS process technology, adding processing performance
• IVA™ 2+ (Image, Video, Audio) accelerator enables multi-standard (MPEG4, WMV9, RealVideo, H.263, H.264) encode/decode at D1 (720x480 pixels), 30 fps
• Integrated image signal processor (ISP) for faster, higher-quality image capture and lower system cost
• Flexible system support
– Composite and S-video TV output
– XGA (1024x768 pixels), 16M-color (24-bit definition) display support
– FlatLink™ 3G-compliant serial display and parallel display support
– High-Speed USB 2.0 On-The-Go support
• Seamless connectivity to Hard Disk Drive (HDD) devices for mass storage
• Leverages SmartReflex™ technologies for advanced power reduction
• M-Shield™ mobile security enhanced with ARM TrustZone™ support
• Software-compatible with OMAP™ 2 processors
• HLOS support for customizable interface
6. OMAP development organization
• The OMAP chip level is divided into several subsystems (e.g. ARMSS/DSPSS/…)
• Each subsystem consists of key IPs
– E.g. ARMSS: ARM core, interrupt controller, security block, bus converter bridges
– Some IPs are reused from earlier programs; some are developed for a target program
• Each IP (or group of IPs) is developed and delivered to subsystem(s) by IP teams spread across different continents
• Each subsystem team integrates and tests the IPs together and delivers the subsystem to the chip level
• The chip level integrates subsystems, peripherals, and power and clock hookup, and tests at chip level
7. How it is organized
[Diagram: a shared validation infrastructure (database, FPGA/silicon platforms, validation flow, tracking systems) supports chip-level teams (RTL, verification, PD, DFT, HW acceleration, validation), which sit above subsystem teams (each with RTL, verification, PD, DFT), which in turn integrate many IPs]
• Now imagine that with ~70 IPs, 10-15 subsystems per chip, 4-5 new chips being done simultaneously (in parallel with 5 chips doing revisions), and 5 time zones
8. How do we do it (and get it right most of the time!)
• AFV (Architecture for Verification)
• Strict IP-to-chip release criteria
• Established IP-to-chip exchange mechanism
• Automation
• Common database / infrastructure / tracking
• And, of course, by increasing frequent-flier miles
9. Architecture for verification
• The OMAP1 series of products had all kinds of bus protocols and behaviors
• OMAP2/OMAP3
– Standard bus-protocol interconnect
– All masters and slaves follow variations of the same protocol
– Plug-and-play
• Not everything is so perfect:
– Power and clock hookup/verification is challenging
– The debug protocol is complicated
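The payoff of a standard interconnect protocol is that one reusable checker can watch every master/slave interface. A minimal sketch of such a checker is below; the request/accept handshake rule it enforces is a hypothetical illustration, not the actual OMAP interconnect specification:

```c
#include <stdbool.h>

/* Sketch of one rule a bus-protocol checker enforces (illustrative
 * handshake, not the real OMAP protocol): once a master raises its
 * request, request and payload must stay stable until accepted. */
typedef struct {
    bool prev_req;        /* request sampled on the previous cycle */
    bool prev_accept;     /* accept sampled on the previous cycle */
    unsigned prev_data;   /* payload sampled on the previous cycle */
    int violations;       /* protocol errors counted so far */
} proto_checker_t;

/* Call once per clock cycle with the sampled interface signals. */
void check_cycle(proto_checker_t *c, bool req, bool accept, unsigned data)
{
    /* A pending request (raised but not yet accepted) must be held. */
    if (c->prev_req && !c->prev_accept) {
        if (!req || data != c->prev_data)
            c->violations++;
    }
    c->prev_req = req;
    c->prev_accept = accept;
    c->prev_data = data;
}
```

Because all masters and slaves follow variations of one protocol, the same checker instance can be bound to any port, which is what makes the "plug-and-play" reuse described above practical.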
10. IP to chip release
• Pre-defined RTL milestones
• Ordered by RTL maturity
– Verification status
– Physical design step completion
• Clear exit criteria
• Same for all IPs / subsystems
• But:
– Exceptions always exist
– We have had to accept/integrate/test critical IPs before they were complete
11. IP to Chip milestones
• Chip: DB setup/planning → Integration → RTL verification → Physical design
• IP: DB setup/planning → Basic testing → >80% done → 100% verification
• Reviews at each milestone
12. IP to chip exchange
• Design delivery (standard views)
– RTL
– Timing related
– Physical design related
– …
• Verification delivery
– Tests/libraries/macros from processor-based subsystems
– Test plans of subsystems for chip-level review
13. Automation
• Automate much of the chip-level RTL coding
– Hookup
– I/O connection
– Register configuration
• Automatically generate tests to check these features
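A generated register-configuration test typically walks a machine-produced table of registers, checking reset values and writable bits. The sketch below illustrates the idea in C; the register map, access functions, and table format are assumptions for illustration, not TI's actual automation output:

```c
#include <stdint.h>

/* One entry per generated check; this map is illustrative. In the real
 * flow the table would be emitted from the IP's configuration data. */
typedef struct {
    uint32_t offset;      /* register offset within the IP */
    uint32_t reset_value; /* expected value after reset */
    uint32_t rw_mask;     /* read/write bits to exercise */
} reg_check_t;

static const reg_check_t reg_table[] = {
    { 0x00, 0x00000000u, 0xFFFFFFFFu },  /* hypothetical control reg */
    { 0x04, 0x00000001u, 0x0000000Fu },  /* hypothetical config reg  */
};

/* Stand-in for bus accesses: an array modelling the register file. */
static uint32_t regs[2] = { 0x00000000u, 0x00000001u };

static uint32_t reg_read(uint32_t off) { return regs[off / 4]; }
static void reg_write(uint32_t off, uint32_t v)
{
    uint32_t i = off / 4;   /* only the writable bits take effect */
    regs[i] = (regs[i] & ~reg_table[i].rw_mask) | (v & reg_table[i].rw_mask);
}

/* Generated test body: verify reset values, then walk writable bits. */
int check_registers(void)
{
    for (unsigned i = 0; i < sizeof reg_table / sizeof reg_table[0]; i++) {
        const reg_check_t *r = &reg_table[i];
        if (reg_read(r->offset) != r->reset_value)
            return -1;                      /* reset-value mismatch */
        reg_write(r->offset, 0xFFFFFFFFu);
        if ((reg_read(r->offset) & r->rw_mask) != r->rw_mask)
            return -1;                      /* stuck-at-0 bit */
        reg_write(r->offset, 0x00000000u);
        if ((reg_read(r->offset) & r->rw_mask) != 0)
            return -1;                      /* stuck-at-1 bit */
    }
    return 0;
}
```

Because both the RTL hookup and the test come from the same source data, a mismatch points directly at an integration error rather than a hand-written test bug.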
14. Common database / infrastructure
• Centralized infrastructure
• Common database for delivery / exchange
• IP delivery and quality tracking
• Dedicated infrastructure team
15. Functional Verification Methodology – same established principles
• Detailed verification plan
• Reviews at critical design points
• Thorough tracking
16. Verification Methodology
Common across all levels:
• Verification process – checkpoints / reviews
• Design Verification Toolkit / Regression Manager / Verification Dashboard
• Verification metrics – coverage, bugs, regression, formal, cycles, efficiency tracking
Per level:
• Module/Block: functional-coverage driven; HVL test bench with scoreboard / checker / assertions; constrained-random testing; reusable test environment; reusable stimulus; exhaustive black/grey-box environment
• Subsystem: functional-scenario driven; directed and random testing; mimics chip-level constraints; reuses module-level components and stimulus
• Chip: application driven; C/ASM-based directed testing; application threads; Operating System boot-up; reuse from module level
• Hardware: HDL test bench; same environment as chip level; synthesizable test bench
17. Module level verification
• Objective
– Validate the module thoroughly before subsystem/system integration
• Goal
– Achieve 100% code and functional coverage
• Strategy
– Use a pseudo-random test generator
– Base infrastructure
• A common methodology is used for all module verification
• Common VIPs are used by modules following the same protocols
– Derived components for specific modules
– Black-box approach (primarily)
18. Module level verification
[Diagram: BFMs drive the input ports of the design under verification; a monitor, protocol checker, and coverage collector sit on each port; a data scoreboard and a register scoreboard compare expected data against the output port, end-to-end]
• Stimulus: directed-random / random
• Correctness: protocol checkers and data checkers (end-to-end)
• Coverage: code and functional coverage
• Property checking for certain blocks
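The end-to-end data scoreboard at the heart of this environment can be sketched in a few lines: the input-port monitor pushes the result predicted by a reference model, and the output-port monitor compares the design's actual result against it in order. Depth and data types here are assumptions for illustration:

```c
#include <stdint.h>
#include <string.h>

#define SB_DEPTH 64  /* max outstanding transactions (illustrative) */

/* Minimal in-order scoreboard: expected results go in at the input
 * port, actual results are compared at the output port. */
typedef struct {
    uint32_t fifo[SB_DEPTH];
    unsigned head, tail;
    int errors;
} scoreboard_t;

void sb_init(scoreboard_t *sb) { memset(sb, 0, sizeof *sb); }

/* Input-port monitor: record what the output is expected to be. */
void sb_expect(scoreboard_t *sb, uint32_t expected)
{
    sb->fifo[sb->tail++ % SB_DEPTH] = expected;
}

/* Output-port monitor: compare what the design actually produced. */
void sb_check(scoreboard_t *sb, uint32_t actual)
{
    if (sb->fifo[sb->head++ % SB_DEPTH] != actual)
        sb->errors++;   /* a real bench would log and flag this */
}
```

Because the comparison is end-to-end, the scoreboard stays valid under any stimulus the pseudo-random generator produces, which is what makes constrained-random testing self-checking.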
19. Subsystem level verification
• Objective
– Validate the subsystems in the design before top-level integration
– Debug/isolate problems inside subsystems that are difficult to find in a large SoC
• Goal
– 100% completion of directed tests as per the verification spec
• Core CPU tests
• Feature-specific directed tests
– 100% of functional coverage items reused from IP-level verification
– 100% coverage of a manual checklist created for test items
• Strategy
– Generate test-bench irritation while the processor runs real code
– Reuse module components
– Isolate the subsystem and mimic the system environment to create top-level scenarios with much less simulation time
20. Subsystem level verification
[Diagram: clock/reset and interrupt generators drive the subsystem under test]
• Stimulus: C/ASM tests for integration, boundary, and functional testing
• Correctness: self-checking tests; checkers reused from module level
• Coverage: toggle at the boundary; directed tests of all target features in the spec
21. Example: The ARM1136J(F)-S subsystem test scenarios
• Reuse the ARM IP test suite
– Retarget CPU tests at the subsystem level
– Tests that cover various AHB parameters
– Basic boot tests
– Exception testing in the subsystem context
– Clock and power management tests
– Feature-specific testing (interrupt handling, security, …)
• Derivative tests
– Base tests with varying test-bench parameters
– Data memory access tests with variable wait states in memory
– Tests run at random clock speeds within the allowable speed limit
– Random interrupts
22. ARM Subsystem verification environment components
• Mandatory components
– A clock/reset/idle control block:
• For creating multiple clock frequencies and random/controlled reset and idle
– An interrupt generator BFM:
• For generating random/controlled simultaneous interrupts and handling them
– A memory interface and memory with variable/random wait states:
• Memory model to support instruction read and data read/write with random latency
• Optional components
– Internal protocol checkers
• Mainly reused from module-level verification
23. Chip level verification
• Objective
– Validate chip integration and handshaking
– Validate real chip-level functional scenarios
• Goal
– 100% of scenarios covered as in the plan
• Strategy
– Mimic the chip environment
– Base SW environment for ease of reuse
– Break into multiple master-slave blocks
– Mix and match real RTL and bus functional models
24. Chip-Level Verification
• Stimulus: C/ASM-based directed tests – chip functional scenarios
• Correctness: self-checking tests; selected checkers from module level
• Coverage: 100% completion of all scenarios in the plan
[Diagram: ARM and DSP BFMs at the center, surrounded by a trace/JTAG BFM, flash and SDR/DDR models, GPIO/UART/McBSP drivers, camera and display BFMs, I/O drivers, and a clock/reset/idle/power-management control block]
25. Simulation environment
• Flexible environment
– Replace RTL with BFMs
– Software models for processors
• Test bench
– Synthesizable
• Dedicated teams for the environment and test bench
26. Software Base
• Test cases use library functions
• The software development library
– Library routines are developed based on all IP functional specs and put in a repository database to be used at all these levels of verification:
• Subsystem level
• Top level
• Chip level (actual silicon)
• A standard format is used for all tests/subroutines/libraries
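One way such a library achieves reuse from subsystem simulation through to silicon is to parameterize every routine by the IP base address, so only the access layer and base change per level. The sketch below illustrates that idea plus a standard test entry point; all names, offsets, and the `test_main()` convention are assumptions, not TI's actual format:

```c
#include <stdint.h>

/* Simulation stand-in for the IP's register file; on RTL or silicon
 * this would be the IP's real MMIO region. */
static uint32_t fake_ip[4];

/* Access layer: on silicon these become volatile MMIO accesses. */
static uint32_t rd32(uint32_t *base, uint32_t off) { return base[off / 4]; }
static void wr32(uint32_t *base, uint32_t off, uint32_t v) { base[off / 4] = v; }

#define CTRL_OFF 0x0u   /* hypothetical control-register offset */
#define CTRL_EN  0x1u   /* hypothetical enable bit */

/* Library routine written against the IP's functional spec: the base
 * address parameter is what lets the same source run at every level. */
int ip_enable(uint32_t *base)
{
    wr32(base, CTRL_OFF, rd32(base, CTRL_OFF) | CTRL_EN);
    return (rd32(base, CTRL_OFF) & CTRL_EN) ? 0 : -1;  /* 0 = pass */
}

/* Standard test format: every test exposes test_main(), returning 0
 * on pass, so any harness can run any test uniformly. */
int test_main(void)
{
    return ip_enable(fake_ip);
}
```

A uniform entry point and return convention is what lets the same test source be collected into regressions at subsystem level, top level, and on silicon without per-test glue.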
27. Key aspects checked at chip level
• Integration of all subsystems (achieve 100% toggle)
• Basic features
• Data- and control-path testing
• Parallel and distributed functionality
• Latency / performance
• Power management
• Application scenarios
• Debug features
29. Beyond RTL
• Hardware acceleration
– Used at subsystem level and chip level
– Stress testing
– Basic software checkout
• Prototyping
– Used at chip level
– Early software development
30. Verification Management
• Detailed test plan at every level – module/subsystem/chip
• Reviews at critical design points with design/spec/system teams
• Tracking of:
– Verification plan
– Test environment development
– Functional coverage development
– Coverage achievement (code, functional)
– Design defects
– Validation defects
– Test development
– Test regression
– Test cycles
– Assertions (formal and simulation)
31. Metric process tracking
Inputs feeding the metrics dashboard:
• Bug tracking: internal tool
• Source code: ClearCase, TDM, CVS
• Runtime tools: ModelSim, VCS, Specman, IUS
• Regression engine: internal, others
• Resource estimator: MS Project, ????
The metrics dashboard serves management requests and trend data:
• Trend analysis
• Risk analysis
• What-if scenarios
32. Verification Metrics
• Required
– Bug curve (logic, DV)
– Source code activity (# lines / # edits)
– Cycles / bug for random testing
– Passing rate
• IP level
• Integration
• System
• ECN verification
– Code coverage (line, branch, toggle)
– Functional coverage
• Level 1: features
• Level 2: cross
• Level 3: scenario
– DV checkpoint status
• Desired
– Sim farm efficiency
• Software license stall time
• Setup / cleanup time
• Cycles / second
• % simulator / % HVL
• Average / distribution of # of running jobs
• Cycles / hour
– Resource stats
• Resource ramp vs. forecast
• Resources invested vs. bottoms-up plan
33. DV Dashboard
[Diagram: simulation/formal 3rd-party tools and internal tools in the DV flow produce regression logs and coverage logs; these are uploaded (converted to a common format) into SQL databases for test coverage, simulation/regression, and defects, alongside the coverage monitor database, bug tracking, and the formal property database]
34. Overall DV Metric System
[Diagram: per-design status tables (e.g. Design A: RTL 80%, DV 40%, DFT 30%, PD 40%; Design B: RTL 85%, DV 70%, DFT 40%, PD 50%) for designs A-F roll up to engineering and executive management. Actual metrics flow automatically from the regression/bug/coverage database and 3rd-party-tool coverage data into the DV dashboard for trend analysis and methodology-compliance checking; expected metrics, review status, and review checklists come from the review system and review database, partly collected manually, with engineering analysis closing the loop]
35. Summary
• OMAP™ verification is a resource- and time-intensive task
• Detailed plans and reviews at all levels eliminate redundancy and provide maximum coverage of functions
• Collaboration is needed at every level
– Architecture
– Design
– Infrastructure
– Verification
• No magic