Outline
1. Verification Plan
2. Verification Environment
3. Verification Guidelines
Strategy of Verification?
How do I verify my design?
• What resources will I need?
• Am I driving all possible input scenarios?
• How will I know a failure has occurred?
• How do I measure my progress?
• When will I be done?
These questions are answered by the Verification Plan
Evolution of the Verification Plan
The verification plan is continuously updated as the design evolves
Design and verification follow the "waterfall" flow
[Figure: evolution of the verification plan through the waterfall flow. Source: Wile, Goss, Roesner, Comprehensive Functional Verification, Elsevier]
The Verification Plan
• Contents of a Verification Plan
Description of verification levels
Required tools
Risks and dependencies
Functions to be verified
Specific tests and methods
Coverage requirements
Test case scenarios
Resource requirements
Schedule details
Contents of Verification Plan
• Description of Verification Levels
Articulating multiple levels of design hierarchy
Group levels together into functional components
► Complexity of individual components
– Proper verification of complex functions requires a high level of control and observability
– Simple functions that do not require a high level of control and observability can be combined with other levels
► Existence of a clean interface and specification
– A "moving target" interface requires that the function be verified individually
– Stable interfaces with simple functions can be combined with other levels
Contents of Verification Plan
• Required Tools
Event simulation tools for units
Cycle simulation tools for chip
Formal verification tools
Assertion-based tools
Debuggers
Emulation hardware
Acceleration hardware
Co-simulation hardware
High-level verification languages
Libraries of functions
Contents of Verification Plan
• Risks and Dependencies
Tool based risks
► Delivery and startup delays
► Integration with established tools
► Educational challenges
On-time HDL delivery dependency: HDL delivery might be scheduled such that simple functions are delivered first and complex functions are delivered later
Reliance on a separate verification team
Architecture closure
► Unresolved specification issues
Sufficient resources
Contents of Verification Plan
• Functions to be verified
Critical functions: Functions the team needs to verify before using the design elsewhere
Secondary functions
► Non-critical to tapeout
– Performance related functions
– Functions to be enabled in the later version of the chip
– Functions with software backup
► Non-critical to the next level of verification
– Functions that can be verified in parallel with the next level of verification
– Corner case conditions
Pervasive functions: Operations that do not occur during normal running conditions
► System resets
► Error handling
► System debug
Non-verified functions at this level
► Team fully verified the function at a lower level and the function will be verified again at a higher level through simulation
► Function is not applicable to this level of verification
Contents of Verification Plan
• Specific tests and methods
What type of verification?
► The functions to be verified
► Exercising the internal structures
► Error manifestation
► Availability of resources
Verification strategy
► Deterministic simulation
– Used for simple designs
► Random-based simulation
– Complex functions (a constrained-random sketch follows this list)
► Formal verification
– Small, complex blocks of design for which many permutations exist
Random aspects
► Hangs due to looping
► Low activity scenarios
► Specific directed tests
Abstraction level
Checking Strategy
► White box testing
► Grey box testing
► Black box testing
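A minimal constrained-random stimulus sketch (the class, field names, and opcode values are illustrative, not from a specific design); a deterministic test would assign the same fields by hand, while a random test calls randomize() in a loop:

class alu_txn;
  rand bit [7:0]  opcode;
  rand bit [31:0] operand_a, operand_b;

  // keep the stimulus legal while still exploring the input space
  constraint c_legal_op { opcode inside {8'h00, 8'h01, 8'h02, 8'h03}; }
endclass

// usage: alu_txn t = new();
//        repeat (100) if (!t.randomize()) $fatal(1, "randomize failed");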
Contents of Verification Plan
• Coverage Requirements
Define coverage goals: a feedback mechanism that evaluates the quality of the verification environment's stimulus generation components (a covergroup sketch follows this list)
► The environment has exercised all types of commands and transactions
► The stimulus has created a specific or varying range of data types
► The environment has driven varying degrees of legal concurrent stimulus
► The initiator and responder components have driven errors into the DUV
Measure coverage progress
Fill coverage holes
Write directed test cases if necessary
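A minimal covergroup sketch tied to goals like the ones above (the command and length fields, bins, and names are illustrative):

class cmd_coverage;
  typedef enum {READ, WRITE} cmd_e;
  cmd_e        cmd;
  int unsigned len;

  covergroup cmd_cov;
    cp_cmd    : coverpoint cmd;   // all command types exercised
    cp_len    : coverpoint len { bins small = {[1:4]}; bins large = {[5:16]}; }
    cmd_x_len : cross cp_cmd, cp_len;   // concurrent combinations of command and length
  endgroup

  function new();
    cmd_cov = new();
  endfunction
  // after each transaction: cov.cmd = t.cmd; cov.len = t.len; cov.cmd_cov.sample();
endclass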
Contents of Verification Plan
• Test Case Scenarios: Matrix
List of interesting test scenarios (a small scenario-mix sketch follows this list)
► Configurations to verify
► Variations of the data items in the environment
► Important attributes of data items
► Interesting sequences for every DUV input port
► Error conditions
► Corner cases
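A minimal sketch of mixing such scenarios in a generator with a weighted randcase; the weights and placeholder messages stand in for real stimulus tasks:

module scenario_mix;
  initial begin
    repeat (10) begin
      randcase
        80 : $display("normal traffic");
        15 : $display("corner case: back-to-back transactions");
        5  : $display("error condition: illegal opcode");
      endcase
    end
  end
endmodule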
Contents of Verification Plan
• Resource Requirements
Manpower
► Type of environment
– Reference model checking environments require more people
– Transaction-based environments require fewer people
► Experience of individuals
Computation resources
► (Length of one test scenario × Number of tests to be run) determines the compute as well as the license resources (a worked example follows this list)
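For a rough, purely illustrative sizing: if one test scenario takes about 10 minutes of simulation and the regression plan calls for 10,000 tests, the total is roughly 100,000 CPU-minutes, or about 1,700 CPU-hours; with 100 simulator licenses running in parallel that is on the order of 17 hours of wall-clock time per full regression.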
Contents of Verification Plan
• Schedule Details
Time-line for the different verification activities should list
► Deliveries of the verification team
► Verification work items
Schedule should contain
► Specification delivery
► Verification environment development
► First HDL delivery
► Verification update
► Regression run
► Release to manufacturing
Schedule should account for each level of hierarchy
Verification should be moved to the next level when the bug rate in a particular level begins to drop (estimate this based on the history of other projects)
[Figure: bugs found over time at the designer, unit, chip, and system levels. Lower levels of verification tend to uncover more bugs since they occur earlier in the design cycle and because verification of each designer or unit level occurs in parallel with the others. It is good practice to wait until the bug rate begins to drop in the low levels before moving to the next level.]
Figure courtesy: Wile, Goss, Roesner, Comprehensive Functional Verification, Elsevier
Verification Environment
• Testbench components
Testbench wraps around the Design Under Test
► Generate stimulus
► Capture response
► Check for correctness
► Measure progress through coverage numbers
• Features of an effective testbench
Reusable and easy to modify for different DUTs
► Object oriented
Testbench should be layered to enable reuse
► Flat testbenches are hard to modify and control
► Layered testbenches separate code into smaller pieces that can be developed separately and combine common actions together
Catches bugs and achieves coverage quickly
► Randomizable!!!
Verification Environment: Layered Testbench
• Signal layer
The DUT and its connections
• Command Layer
Converts commands such as send(), read(), write() into signal-level activity on the DUT inputs
Converts the DUT output signals back into commands
Assertions are written and monitored: assertions encode the expected behavior of the system
[Layer diagram: Driver, Assertions, and Receiver form the Command layer, sitting on the Signal layer and the DUT]

task Driver::send_instr();
  ...
  in_box1.get(instr2send);
  lc3.cb.complete <= 1;
  lc3.cb.dout     <= instr2send.op;
  ...
endtask

task Driver::send_address();
  ...
  in_box2.get(src1send);
  if (src1send.op == MEM)
    lc3.cb.src1_addr <= src1send.addr1;
endtask

task Receiver::get_output();
  ...
  pkt_to_chk.val1 = lc3.cb.out1;
  pkt_to_chk.din  = lc3.cb.din;
  pkt_to_chk.dest = lc3.cb.addr;
endtask
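Since this layer is where assertions live, here is a minimal concurrent-assertion sketch, assuming a hypothetical req/ack handshake on the DUT interface (the signal names are illustrative and not taken from the lc3 design above):

module handshake_checker (input logic clk, reset, req, ack);
  // Expected behavior: every request is acknowledged within 1 to 4 cycles.
  property req_gets_ack;
    @(posedge clk) disable iff (reset)
      req |-> ##[1:4] ack;
  endproperty

  a_req_ack: assert property (req_gets_ack)
    else $error("req was not acknowledged within 4 cycles");
endmodule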
Verification Environment: Layered Testbench
• Functional Layer
Converts high-level transactions (TXor), for example a DMA read operation, into the commands that drive the DUT
Copies each transaction into temporary storage (the scoreboard) to keep tabs on the inputs to the DUT and their order
Receives data from the output of the DUT and checks it against the expected result
[Layer diagram: Agent, Scoreboard, and Checker form the Functional layer, above the Command layer (Driver, Assertions, Receiver), the Signal layer, and the DUT]

task send_input();
  ...
  send_instr();
  send_address();
  send_control();
endtask

task receive();
  ...
  get_output();
  check();
  ...
endtask

task check();
  ...
  get_output();
  ...
  if (pkt_to_chk.data != expected)
    $display("ERROR .. ..");
endtask
Verification Environment: Layered Testbench
• Scenario Layer
Generates the inputs of interest
Directed tests or constrained random stimulus

task gen();
  // generate an arbitrary number of payloads
  // and create random types of inputs
  repeat ($urandom_range(9, 12))
  begin
    payload_src1.push_back($random);
    payload_src2.push_back($random);
    payload_imm.push_back($random);
  end
endtask

push_back() is a built-in queue method that inserts the specified object at the end of the queue. More on this as the lectures progress.
[Layer diagram: the Generator forms the Scenario layer, above the Functional layer (Agent, Scoreboard, Checker), the Command layer (Driver, Assertions, Receiver), the Signal layer, and the DUT]
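As a quick illustration of the queue operations used above, a minimal self-contained sketch (the queue name and values are illustrative):

module queue_demo;
  byte payload[$];   // an unbounded queue of bytes

  initial begin
    payload.push_back(8'hA5);   // append to the tail of the queue
    payload.push_back(8'h3C);
    $display("size = %0d", payload.size());        // size = 2
    $display("head = %0h", payload.pop_front());   // removes and returns A5
  end
endmodule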
Verification Environment: Layered Testbench
• Test Layer and Functional Coverage
The test controls everything that goes on in the environment
Sets constraints for the stimulus to be sent in
Creates multiple combinations of tests
Functional coverage results are used to determine the constraints for the next set of inputs (a test sketch follows the layer diagram below)
[Layer diagram: the Test and Functional Coverage sit at the top, above the Scenario layer (Generator), the Functional layer (Agent, Scoreboard, Checker), the Command layer (Driver, Assertions, Receiver), the Signal layer, and the DUT; coverage results feed back into the test]
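A minimal sketch of how a test can steer stimulus by overriding constraints, assuming a hypothetical rand knob named burst_len (all names here are illustrative):

class base_test;
  rand int unsigned burst_len;
  constraint c_burst { burst_len inside {[1:8]}; }   // default stimulus profile

  virtual task run();
    // configure the generator, start stimulus, wait for completion, report coverage
  endtask
endclass

// If functional coverage shows that long bursts were never exercised, a derived test
// overrides the same-named constraint to close that hole.
class long_burst_test extends base_test;
  constraint c_burst { burst_len inside {[16:32]}; }
endclass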
Verification Environment: Layered Testbench
• Benefits of a layered testbench environment
Environment updating takes less time
Testbench is easy to constrain from the top level file
All legal device configurations are tested
► Regression can select different DUT configurations
► Configuration object is randomized and constrained (a sketch follows this list)
Enables reuse
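A minimal sketch of such a configuration object, assuming a DUT with a hypothetical number of ports and an optional error-injection mode (the field names and legal ranges are illustrative):

class dut_config;
  rand int unsigned num_ports;
  rand bit          inject_errors;

  constraint c_legal {
    num_ports inside {[1:4]};             // only legal port counts
    inject_errors dist {0 := 9, 1 := 1};  // inject errors in roughly 10% of runs
  }
endclass

// Each regression run randomizes the configuration before building the environment:
//   dut_config cfg = new();
//   if (!cfg.randomize()) $fatal(1, "configuration randomization failed");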