Lecture 2: Introduction


ASIC Verification

Introduction to Verification

Fall 2011
Meeta Yadav

©2011, Meeta Yadav 1


Why do we need Functional Verification?

• Designs are becoming more complex
  In the age of digital convergence, increased functionality increases the
 number of transistors and the possibility of error in the design
• The cost of undetected problems grows over time
  There is little cost in detecting a problem during device verification,
 but the cost increases substantially if the bug is detected by the customer

[Figure: cost of a bug vs. time, rising through the Verification, Systems Test,
and Customer phases]
Source: Wile, Goss, Roesner: Comprehensive Functional Verification: Elsevier

©2011, Meeta Yadav 2


Which is more complex?

Saturn V moon rocket: 300,000 parts in 70,000 assemblies
45 nm ASIC: 1 billion transistors with ~10 billion sub-components

©2011, Meeta Yadav 3


Design Failures: Examples

• Pentium floating point bug* (1994)
  The chip produced inaccurate results for certain floating point
 calculations:
 x = 4195835, y = 3145727, and z = x - (x/y)*y
 gave the answer as 256 instead of 0
  The error occurred due to the omission of lookup-table entries and was hard
 to detect, since only about 1 in 9 billion calculations was affected

• Explosion of the Ariane 5** rocket (1996)
  The unmanned Ariane 5 rocket of the European Space Agency exploded 40
 seconds after lift-off (cost: $7 billion of development plus a $500 million
 rocket and cargo)
  The error was in the conversion of a 64-bit horizontal velocity value to a
 16-bit signed integer
* http://www.maa.org/mathland/mathland_5_12.html
** http://www.cnn.com/WORLD/9606/04/rocket.explode/
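The Ariane failure mode is easy to reproduce in any HDL: assigning a wide value
to a narrower signed type silently truncates. A minimal SystemVerilog sketch
(the variable names and the value 40000 are illustrative assumptions; the actual
flight software was written in Ada):

  // Hedged sketch: narrowing a 64-bit value to a 16-bit signed integer
  // silently wraps around instead of raising a range error.
  module narrowing_demo;
    initial begin
      longint unsigned horizontal_velocity = 40_000; // exceeds 2**15 - 1
      shortint         converted;                    // 16-bit signed
      converted = horizontal_velocity;               // truncates to the low 16 bits
      $display("%0d became %0d", horizontal_velocity, converted); // prints -25536
    end
  endmodule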

©2011, Meeta Yadav 4


Verification Trends
Principal contributors:
- Functional bugs
- Clocking-related bugs

[Figure: IC/ASIC Designs Requiring Re-Spins by Type of Flaw (2002 vs. 2004 market
studies); percent of designs requiring two or more silicon spins, broken down into
logic/functional, clocking, tuning analog circuit, fast path, yield/reliability,
delays/glitches, slow path, mixed-signal interface, power consumption, IR drops,
firmware, and other] [Collett 2005]

50% of ASICs require more than one respin

©2011, Meeta Yadav 5
The Verification Challenge
• Two major verification challenges are:
  The challenge of state space explosion
   ► Formal Verification
   (There are more theoretical states in today's design than there are atoms
   in the universe!)

  The challenge of detecting incorrect behavior
   ► Functional Verification
     – Constrained Random Tests
     – Coverage
     – Assertions

©2011, Meeta Yadav 6
The Verification Challenge

[Figure: design size in millions of gates over time: the ability to fabricate
grows faster than the ability to design, which in turn grows faster than the
ability to verify (directed), opening a design gap and a verification gap]*

 The Design & Verification Gap
   — Many companies are still using 1990s techniques
   — Traditional verification techniques can't keep up with today's designs

* Based on data from the Collett International 2004 FV Survey

©2011, Meeta Yadav 7
Traffic Controller
Design description
 Light should stay green for 1 minute in each direction when the
intersection is busy

[Figure: intersection of Main Street and Elm Street]

©2011, Meeta Yadav 8


Traffic Controller
Algorithm Implemented

[Flowchart: wait 60 seconds, then check "Main St. traffic?": if yes, Main St.
turns green; if no, check "Elm St. traffic?": if yes, Elm St. turns green]

Is there a problem with the design? (See the assertion sketch below.)

Algorithm for a Traffic Controller by Eagelton Signal Controllers And Parking
Engineering Solutions (ESCAPES)
Source: Wile, Goss, Roesner: Comprehensive Functional Verification: Elsevier
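One way to phrase a suspected problem with this algorithm as a checkable
property: if a car is waiting on Elm St., it must eventually get a green light.
A minimal SVA sketch, assuming hypothetical signals clk, elm_waiting, and
elm_green and an illustrative 120-cycle bound:

  // Hedged sketch: signal names and the cycle bound are assumptions made for
  // illustration, not part of the original design.
  module tlc_starvation_check(input logic clk, elm_waiting, elm_green);
    property elm_eventually_green;
      @(posedge clk) elm_waiting |-> ##[1:120] elm_green;
    endproperty
    assert property (elm_eventually_green)
      else $error("Elm St. traffic waited more than 120 cycles without green");
  endmodule

If Main St. always has traffic, the flow above never reaches the Elm St. branch,
and an assertion like this would fire during simulation.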

©2011, Meeta Yadav 9


Functional Verification
• Goals of functional verification are to ensure
  The design accurately performs the tasks intended by the overall system
 architecture
   ► To detect a missing feature
   ► To detect a missing corner condition
  There are no bugs in the tasks implemented in the design

Functional verification testbenches should be built from the general
specifications and architecture

[Figure: design flow from customer requirements, to general specification and
architecture, to high-level chip design, to HDL implementation at the RTL level
(functional verification feeds fixes back to the HDL), to physical circuit design
via synthesis, to the fabricated chip]
Source: Wile, Goss, Roesner: Comprehensive Functional Verification: Elsevier

©2011, Meeta Yadav 10


Functional Verification

A well-verified chip reduces cost by avoiding:
  Re-fabrication
  Recalls

[Figure: the design process from customer requirements through general
specification and architecture, high-level chip design, HDL implementation and
functional verification at the RTL level (with fixes fed back to the HDL), and
physical circuit design via synthesis, to the fabricated chip, alongside
manufacturing, timing analysis, and system testing]
Source: Wile, Goss, Roesner: Comprehensive Functional Verification: Elsevier
©2011, Meeta Yadav 11
Cost of Bugs

[Figure: cost of a bug vs. time, rising through the Verification, Systems Test,
and Customer phases]

The cost of undetected problems grows over time

Source: Wile, Goss, Roesner: Comprehensive Functional Verification: Elsevier

©2011, Meeta Yadav 12
Verification Productivity

[Figure: number of bugs found vs. time across the Verification and Systems Test
phases; productivity improvements drive early problem discovery]

Verification productivity reduces cost and schedule

Source: Wile, Goss, Roesner: Comprehensive Functional Verification: Elsevier

©2011, Meeta Yadav 13
Overall Testbench Functionality
• Generate stimulus
• Apply stimulus to the Design Under Test (DUT)
• Capture the response
• Check for correctness
• Measure the progress against the overall verification goals

[Figure: testbench block diagram: stimulus generation, stimulus application to
the Design Under Test, response capture, and correctness check against a golden
model, all under progress check and control of the verification process]
©2011, Meeta Yadav 14


Directed Testing Approach
• Directed Test Progress
  Writing stimulus vectors to exercise various individual features of the design
  Is a time and resource consuming exercise
  Good for small test space with limited variations

[Figure: directed test progress: coverage climbs toward 100% in steps as directed
tests are written over time]

Time and resource consuming
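For contrast with the constrained-random style shown on later slides, a minimal
sketch of a directed test against the illustrative adder DUT from the previous
slide: every interesting case is enumerated by hand.

  // Hedged sketch: hand-written stimulus vectors, one per targeted feature.
  module directed_tb;
    logic [7:0] a, b;
    logic [8:0] sum;
    adder dut (.a(a), .b(b), .sum(sum));

    task check();
      if (sum !== a + b) $error("mismatch: %0d + %0d gave %0d", a, b, sum);
    endtask

    initial begin
      a = 8'd0;  b = 8'd0;  #1; check();      // feature: zero inputs
      a = 8'hFF; b = 8'd1;  #1; check();      // feature: carry out of bit 7
      a = 8'hFF; b = 8'hFF; #1; check();      // feature: maximum operands
    end
  endmodule

Each new feature needs another hand-written vector, which is what makes the
approach time and resource consuming for large designs.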

©2011, Meeta Yadav 15


Directed Testing Approach
• Directed Testing Coverage
 Individual test cases cover individual features in the design space
 Over time the entire design space can be covered
 Increase in the complexity of the design causes an increase in the time to
cover the entire design space

[Figure: directed test coverage: covered features and uncovered features in the
design space, with a bug marked in an uncovered region]

©2011, Meeta Yadav 16


Randomization

• Why Randomize?
 Create VALID “corner” states: system states of maximum device discomfort
• What do you randomize?
 Device Configuration: Move device to states of least probability
 Input data
 Error handling:
► Exception handling by protocols (bus protocols, handshaking, memory
interface)
► Recovery from errors in system states
 Relative timing between blocks
► Blocks operating at different rates
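A minimal sketch of what these randomization targets look like in SystemVerilog.
The transaction fields and configuration knobs (addr, data, err_inject,
gap_cycles) are illustrative assumptions, not a specific device:

  // Hedged sketch: rand fields for input data, error injection, and relative timing.
  class bus_txn;
    rand bit [31:0] addr;        // input data
    rand bit [31:0] data;
    rand bit        err_inject;  // error handling: occasionally corrupt a transfer
    rand int        gap_cycles;  // relative timing between blocks/transactions

    constraint c_valid {
      addr inside {[32'h0000_0000:32'h0000_FFFF]};  // keep addresses legal
      gap_cycles inside {[0:10]};
      err_inject dist {1'b0 := 9, 1'b1 := 1};       // errors are rare but present
    }
  endclass

Device configuration is usually randomized the same way, through a separate
configuration class (see the testbench environment slides later).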

©2011, Meeta Yadav 17


Stimulus: Constrained Random Testing

• Constrained Random Test Progress
• Constrain stimulus to VALID limits and allow the tool to generate values at
random within those limits
• Random stimulus is required to exercise a complex design
• Useful in finding unanticipated bugs

[Figure: constrained random vs. directed test progress: constrained random tests
take more time to write up front but reach 100% coverage in less time]

Achieves coverage faster
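A minimal sketch of driving constrained-random stimulus using the illustrative
bus_txn class from the randomization slide; each randomize() call asks the
constraint solver for a new legal set of values, and a different simulation seed
gives a different sequence:

  // Hedged sketch: the driver call is a hypothetical placeholder.
  module crt_demo;
    initial begin
      bus_txn t = new();
      repeat (1000) begin
        if (!t.randomize())                   // solver picks values within the constraints
          $fatal(1, "randomization failed");
        // drive_to_dut(t);                   // hypothetical transactor call
      end
    end
  endmodule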

©2011, Meeta Yadav 18


Stimulus: Constrained Random Testing
• Constrained Random Test Coverage
 Random tests cover wider design space than directed tests
 Tests may overlap
 Directed tests are still needed to test uncovered features

[Figure: constrained-random test coverage: random tests reach new areas of the
design space and overlap one another, while directed testcases fill specific
remaining holes]

©2011, Meeta Yadav 19


Coverage Convergence
How do we achieve coverage convergence?
 Start with a fully randomizable testbench
   ► Run many randomized simulations
   ► Analyze cumulative coverage and coverage holes
 Then, with minimal code changes
   ► Add constrained stimulus to fill coverage holes
 Finally
   ► Write a few directed tests to hit the remaining holes

[Figure: coverage-convergence loop: constrained random tests run many times with
different seeds, functional coverage identifies holes, minimal code modifications
add constraints, and a few directed tests close the remaining holes]
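A minimal sketch of the coverage half of this loop, using the illustrative
bus_txn fields from the earlier slides; a covergroup records which regions have
been exercised, and a hole found in the report can be closed with a small added
constraint:

  // Hedged sketch: functional coverage over assumed transaction fields.
  covergroup bus_cg with function sample(bit [31:0] addr, bit err);
    cp_addr : coverpoint addr {
      bins low  = {[32'h0000_0000:32'h0000_7FFF]};
      bins high = {[32'h0000_8000:32'h0000_FFFF]};
    }
    cp_err  : coverpoint err;
    cp_cross: cross cp_addr, cp_err;    // a hole might be "error in the high region"
  endgroup

  module cov_demo;
    bus_cg cg = new();
    initial begin
      bus_txn t = new();
      repeat (1000) begin
        void'(t.randomize());
        cg.sample(t.addr, t.err_inject);
      end
      $display("functional coverage = %f%%", cg.get_coverage());
    end
  endmodule

If the cross never hits "error in the high region", a one-line inline constraint
in the test, for example randomize() with { err_inject == 1; addr >= 32'h8000; },
steers stimulus into the hole without modifying the testbench.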

©2011, Meeta Yadav 20


Coverage Feedback

• For a large verification space, consider coverage feedback

[Figure: coverage (% of possible bugs checked) vs. time (code writing and output
checking): constrained random with coverage feedback converges fastest, followed
by constrained random without coverage feedback, then directed tests; there is an
initial latency for coding the constrained random tests]

©2011, Meeta Yadav 21


SystemVerilog Assertions in Verification Strategy

• Assertions:
  1. Capture designer intent
  2. Allow protocols to be defined and verified
  3. Reduce time to market
  4. Greatly simplify the verification of reusable IP
  5. Facilitate functional coverage metrics
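A minimal sketch of items 1, 2, and 5: a handshake rule written as an SVA
property, asserted as a check and reused as a cover point. The req/ack signal
names and the 4-cycle bound are illustrative assumptions:

  // Hedged sketch: one protocol rule, checked and covered.
  module handshake_checker(input logic clk, rst_n, req, ack);
    // Every request must be acknowledged within 1 to 4 cycles.
    property req_gets_ack;
      @(posedge clk) disable iff (!rst_n) req |-> ##[1:4] ack;
    endproperty

    a_req_ack: assert property (req_gets_ack)
      else $error("req not acknowledged within 4 cycles");
    c_req_ack: cover property (req_gets_ack);  // counts completed handshakes
  endmodule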

©2011, Meeta Yadav 22


SystemVerilog Assertions in Verification Strategy

[Figure: bugs found vs. time during block design and system integration: catching
bugs at the block level costs more up front but lets the design ship much earlier
with less risk and cost; catching them during system integration means shipping
much later, with a margin of error]

Assertions reduce time to market

©2011, Meeta Yadav 23


Writing an Effective Testbench

• Testbench:
  Emulates the environment of the DUT
• Features of an effective testbench (see the sketch below)
  Reusable and easy to modify for different DUTs
   ► Object-oriented
  Layered: orthogonalize concerns
   ► Transactors (encapsulation of protocol) and interfaces
  Catches bugs or achieves coverage quickly
   ► Randomizable
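A minimal sketch of the layering idea: a transaction class, an interface that
encapsulates the DUT pins, and a driver (transactor) that turns transactions
into pin activity. All names are illustrative assumptions:

  // Hedged sketch of a layered testbench: transaction -> driver -> interface -> DUT.
  interface bus_if(input logic clk);        // encapsulates the physical pins
    logic        valid;
    logic [31:0] addr, data;
  endinterface

  class driver;                             // transactor: the protocol lives in one place
    virtual bus_if vif;

    function new(virtual bus_if vif);
      this.vif = vif;
    endfunction

    task drive(bus_txn t);                  // reuses the illustrative bus_txn class
      @(posedge vif.clk);
      vif.valid <= 1'b1;
      vif.addr  <= t.addr;
      vif.data  <= t.data;
      @(posedge vif.clk);
      vif.valid <= 1'b0;
    endtask
  endclass

Because the protocol is encapsulated in the driver and the pins in the interface,
swapping the DUT or the bus protocol touches one layer instead of every test.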

©2011, Meeta Yadav 24


Benefits of a Testbench Environment

• Environment updating takes less time
• Testbench is easy to constrain from the top-level file
• All legal device configurations are tested (see the sketch below)
  Regression can select different DUT configurations
  Configuration object is randomized and constrained
• Enables reuse
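A minimal sketch of the configuration-object idea: the environment carries a
randomizable config class, and an individual test constrains it from the top
level without editing the environment. Names are illustrative assumptions:

  // Hedged sketch: a randomized, constrained DUT configuration.
  class dut_config;
    rand int unsigned num_ports;
    rand bit          cache_enable;
    constraint c_legal { num_ports inside {[1:4]}; }   // only legal configurations
  endclass

  // A top-level test narrows the configuration for one regression run:
  module config_test;
    initial begin
      dut_config cfg = new();
      if (!cfg.randomize() with { num_ports == 1; cache_enable == 0; })
        $fatal(1, "config randomization failed");
      $display("running with %0d port(s), cache_enable=%0b",
               cfg.num_ports, cfg.cache_enable);
    end
  endmodule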

©2011, Meeta Yadav 25


Levels of Verification

[Figure: Hierarchical Diagram of a Multiple Node System
  System level: the complete system
  Board level: boards, backplane, nodes
  Chip level: chips on a board (peripheral memory, processor, local bus arbiter,
    memory controller, PCI, south bridge, peripheral)
  Unit level: units within the processor (DMA, cache, ALU, FPU, ...)
  Designer level: designer-level blocks within a unit such as the cache (memory
    access unit, memory unit, bus snoop)]

Figure Courtesy: Wile, Goss, Roesner: Comprehensive Functional Verification: Elsevier

©2011, Meeta Yadav 26


Levels of Verification and Bugs Detected

[Figure: number of bugs found vs. time at the designer, unit, chip, and system
levels of verification]

Figure Courtesy: Wile, Goss, Roesner: Comprehensive Functional Verification: Elsevier

Lower levels of verification tend to uncover more bugs, since they occur earlier
in the design cycle and because verification of each designer- or unit-level block
occurs in parallel with the others. It is good practice to wait until the bug rate
begins to drop at the lower levels before moving to the next level.

©2011, Meeta Yadav 27


Thank You

©2011, Meeta Yadav 28
