TRANSCRIPT
Chapter 7. Macro Verification Guidelines
The goal of macro verification: the macro is 100 percent correct in its functionality and timing.
This chapter's topics:
Overview of macro verification
Testbench design
Timing verification
7.1 Overview of Macro Verification
Design verification is one of the most difficult and challenging aspects of design.
Some particular challenges:
The verification goal must be zero defects, because the macro may be used in many applications.
The goal of zero defects must be achieved for all legal configurations of the macro.
The integration team must be able to reuse the macro-level testbenches and test suites.
The entire set of testbenches and test suites must be reusable by other design teams.
The testbenches must be compatible with the verification tools used throughout the system testing process.
7.1.1 Verification Plan
It is essential that comprehensive functional verification plans be created, reviewed, and followed by the design team.
By defining the verification plan early, the design team can develop the verification environment, including testbenches and verification suites, early in the design cycle.
The verification environment is the set of testbench components.
The verification plan should be fully described either in the functional specification for the macro or in a separate verification document.
7.1.2 Verification Strategy
Verification of a macro consists of three major phases:
Verification of individual subblocks
Macro verification
Prototyping
The overall verification strategy is to achieve a very high level of test coverage at the subblock level, and then to focus macro-level verification on the interfaces between blocks, overall macro functionality, and corner cases of behavior.
At each phase of the verification process, the team needs to decide what kinds of tests to run, and what verification tools to use to run them.
The basic types of verification tests:
Compliance testing
Compliance with the functional specification for the design is checked as fully as possible.
Corner case testing
These tests try to find the complex scenarios, or corner cases, that are most likely to break the design.
Random testing
Unlike compliance and corner case tests, random tests can create scenarios that the engineers do not anticipate, and often uncover the most obscure bugs in a design.
Real code testing
Running the design in a real application, with real code.
Regression testing
Verify that the existing baseline of functionality is maintained as new features are added and bugs are fixed.
7.1.2 Verification Strategy(Cont’)
The verification tools available to the macro design team:
Simulation
Most of the macro verification is performed by simulating the RTL on an event-driven simulator.
Event-driven simulators give a good compromise between fast compile times and good simulation speed at the RTL level.
Testbench Automation Tools
Tools such as Vera and Specman Elite can aid the task of creating reusable testbenches.
They provide mechanisms for generating random tests and for checking test coverage.
They provide mechanisms for communication between testbench objects.
Code Coverage Tools
Tools such as VHDLCover, VeriSure, VeriCov, and CoverMeter provide the ability to assess the quality of the verification suite.
7.1.2 Verification Strategy(Cont’)
The verification tools available to the macro design team:
Hardware modeling
A hardware modeler provides an interface between a physical chip and the software simulator.
It allows the designer to compare the simulation results of the RTL design with those of an actual chip.
Emulation
Emulation is an appropriate tool for running real code on a large design, but is not a very useful tool for small macro development.
Prototyping
Prototyping allows execution of real code in a real application at real-time speeds.
Prototyping should only occur late in the design phase.
7.1.3 – 7.4.5 Summary
Ultra High-Speed Intelligent Network Transmission / Switching Laboratory, Seung-yong Jeon
Goal at this stage: 100 percent statement and path coverage, as measured by a test coverage tool such as VeriSure or VHDLCover.
The best way to do this is to add checking logic to the testbench.
Automated response checking is superior to visual verification of waveforms because:
It is less error-prone.
It enables checking of longer tests.
It enables checking of random tests.
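The argument above can be made concrete with a small sketch. This is not code from the chapter: `ref_model` stands in for a golden reference and `dut_model` for the simulated design (here, a toy 8-bit parity generator), chosen only to show the checking structure.

```python
# Sketch of automated response checking: every DUT response is compared
# against a reference model instead of being inspected visually on a
# waveform. ref_model/dut_model are hypothetical stand-ins.

def ref_model(word: int) -> int:
    """Golden reference: parity of an 8-bit word."""
    return bin(word & 0xFF).count("1") % 2

def dut_model(word: int) -> int:
    """Placeholder for the simulated design's response (XOR of bits)."""
    parity = 0
    for i in range(8):
        parity ^= (word >> i) & 1
    return parity

def check_responses(stimuli):
    """Compare DUT and reference responses; collect any mismatches."""
    mismatches = []
    for word in stimuli:
        expected, actual = ref_model(word), dut_model(word)
        if expected != actual:
            mismatches.append((word, expected, actual))
    return mismatches

# Long or random test runs are checked automatically, stimulus by stimulus.
errors = check_responses(range(256))
```

Because the checker runs inside the testbench, a test of any length, including a random one, is verified with the same effort as a short directed test.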
7.1.3 Subblock Simulation
Rule: All response checking should be done automatically.
Guideline: Subblock testing is the easiest place to detect design errors.
7.1.4 Macro Simulation
The major source of errors at the macro integration level will be either interface problems or the result of the design misunderstanding the specification.
The emphasis at this stage is on testing the interaction between the component subblocks and the interfaces of the macro with its environment.
7.1.5 Prototyping
The design reuse methodology encourages rapid prototyping to complement simulation, and to compensate for the less-than-100 percent coverage at the macro verification stage.
Building a prototype ASIC is required for macros that must be tested at speeds or gate counts exceeding those of FPGA and laser technologies.
7.1.6 Limited production
Typically, limited production involves working with just a few (1-4) customers and making sure that they are successful using the macro, before releasing the macro to widespread distribution.
7.2 Inspection as Verification
Code inspections are much more effective than test and debug for finding bugs.
Typical approach to performing design and code reviews:
Design review
Reviews the requirements for the design, and describes in some detail how the design meets these requirements.
Its purpose is to review the approach to solving the problem, and to make sure that it is sound.
Code review
The object of the code review is to do a detailed peer review of the code.
It is usually done after a subblock has been designed and verified, and before it is integrated into the macro.
Linting tools such as Verilint, VHDLlint, and tools from Leda S.A. and Escalade can check for a variety of potential sources of error.
7.3 Adversarial Testing
Subblock or unit testing is done by the designer, and typically much of the macro verification is done by the design team.
The combination of these two approaches usually gives the best results.
7.4 Testbench Design
Testbench design differs depending on the function of the macro.
There are significant differences between subblock testbench design and top-level macro testbench design.
7.4.1 Subblock Testbench
Figure 7-1. Typical testbench for a subblock (an input transaction generator drives the subblock's input interface; an output transaction checker monitors its output interface)
Testbenches for subblocks tend to be rather ad hoc, developed specifically for the subblock under test.
At some abstract level, though, they tend to look like Figure 7-1.
7.4.1 Subblock Testbench(Cont’)
Stimulus Generation
We start by analyzing the functionality of the subblock to determine useful sequences that will verify that the subblock complies with the specification.
Then we search for the corner cases of the design: those sequences of combinations of transactions and data values that are most likely to break the design.
7.4.1 Subblock Testbench(Cont’)
Once we have developed all the tests we can in this manner, we run a code coverage tool.
This tool gives us a good indication of the completeness of the test suite.
Output Checking
An automatic output checker is a necessary part of the testbench.
Some aspects common to most checkers:
Verify that only legal transactions are generated at the output port of the design.
Verify that the specific transactions are correct responses to the input transactions generated.
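The two checker duties can be sketched as follows. The transaction tuples, opcode names, and in-order response assumption are illustrative, not from the chapter; a real checker would match the macro's actual protocol.

```python
# Sketch of the two common output-checker duties:
# (1) every output transaction is legal, and
# (2) each output is the correct response to an input transaction.
# Opcode names and the (opcode, tag) format are hypothetical.

LEGAL_OPCODES = {"READ_RSP", "WRITE_ACK"}
EXPECTED_RSP = {"READ": "READ_RSP", "WRITE": "WRITE_ACK"}

def check_outputs(inputs, outputs):
    """inputs/outputs: in-order lists of (opcode, tag) transactions."""
    errors = []
    for (in_op, in_tag), (out_op, out_tag) in zip(inputs, outputs):
        if out_op not in LEGAL_OPCODES:
            errors.append(f"illegal opcode {out_op}")        # duty (1)
        elif out_op != EXPECTED_RSP[in_op] or out_tag != in_tag:
            errors.append(f"wrong response to {in_op}#{in_tag}")  # duty (2)
    if len(outputs) != len(inputs):
        errors.append("response count mismatch")
    return errors
```

An empty error list means the output stream was both legal and consistent with the stimulus; any entry pinpoints the failing transaction.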
7.4.2 Macro Testbench
There are several reasons why we want to develop a more powerful and well-documented testbench at this level:
The design is now considerably more complex, and so more complex test scenarios will be required for complete testing.
More people will typically be working on verification of the macro, often the entire team that developed the subblocks.
The testbench will be shipped along with the macro so that the customer can verify the macro in the system design.
7.4.2 Macro Testbench(Cont’)
An interface macro, such as a PCI interface, might have a testbench like the one shown in Figure 7-2.
Figure 7-2. Macro development and verification environment (a testbench command file drives several PCI BFM primary masters on the PCI bus and the PCI macro; the macro's master and slave application interfaces are exercised by application BFM masters and observed by application monitors; an on-the-fly checker and bus monitor write monitor and check log files)
Figure 7-3. Software-driven testbench for macro-level testing (application software and drivers run in a HW/SW cosimulation environment; a translator drives a PCI bus functional model and an application bus functional model around the PCI macro, with PCI bus and application bus monitors)
A more complex testbench is shown in Figure 7-3.
7.4.3 Bus Functional Models
Bus functional models (BFMs) are a common method of creating testbenches.
The intent of these models is to model only the bus transactions of an agent on the bus.
Well-designed BFMs allow the test developer to specify the transactions on the bus at a relatively high level of abstraction.
BFMs are also extremely useful for system-level simulation.
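The key idea, one high-level call expanding into cycle-by-cycle bus activity, can be sketched as below. The signal names (`frame`, `ad`, `irdy`) are loosely PCI-flavored but the protocol here is invented for illustration; a real BFM would implement the actual bus specification.

```python
# Sketch of a bus functional model: the test writer issues high-level
# read/write transactions, and the BFM expands each into a sequence of
# pin-level bus cycles. The signal names and phases are illustrative,
# not a real PCI implementation.

class BusFunctionalModel:
    def __init__(self):
        self.trace = []              # recorded pin-level activity

    def _cycle(self, **signals):
        """Drive one bus cycle (here, just record the signal values)."""
        self.trace.append(signals)

    def write(self, addr, data):
        """One high-level call becomes address, data, and idle cycles."""
        self._cycle(frame=0, ad=addr, cmd="WRITE")   # address phase
        self._cycle(frame=0, ad=data, irdy=0)        # data phase
        self._cycle(frame=1)                         # idle / turnaround

    def read(self, addr):
        self._cycle(frame=0, ad=addr, cmd="READ")    # address phase
        self._cycle(frame=0, irdy=0)                 # data phase
        self._cycle(frame=1)                         # idle / turnaround

# The test is written at the transaction level, not the pin level:
bfm = BusFunctionalModel()
bfm.write(0x1000, 0xDEAD)
bfm.read(0x1000)
```

This abstraction is what makes macro tests readable and reusable: two lines of test intent produce all the cycle-level detail the bus requires.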
7.4.4 Automated Response Checking
One effective technique is to compare the output responses of the macro to those of a reference design.
If the macro is being designed to be compatible with an existing chip, then the chip itself can be used as a reference model.
A hardware modeler can be used to integrate the physical chip as a model in the simulation environment.
Figure 7-4. Self-checking testbench using a hardware modeler (the same stimulus drives both an 8051 chip on a hardware modeler and the 8051 macro RTL, and their responses are compared)
7.4.4 Automated Response Checking(Cont’)
Figure 7-4 shows such a configuration.
7.4.5 Verification Suite Design
Developing a verification environment that can completely verify a macro is every bit as difficult as specifying or designing the macro in the first place.
Functional Testing
The first step in developing the verification suite is verifying that the macro implements the functions described in the specification.
Testbench automation tools can help by providing powerful constructs for describing this functionality.
Functional tests are necessarily a subset of a complete verification of the macro.
7.4.5 Verification Suite Design(Cont’)
Corner Case Testing
Corner case testing is intended to test the implementation details not covered in the functional testing.
Another typical set of corner cases involves designs with shared resources, such as a bus arbiter, or a block that has multiple agents accessing the same FIFO.
7.4.5 Verification Suite Design(Cont’)
Code Coverage and Random Testing
There are two useful techniques for completing the verification suite: code coverage and random testing.
Code coverage
It indicates what parts of the code have been tested.
This information allows the designer to create focused tests for the untested sections of code.
When manually adding new tests has become tedious or impractical, random testing can help improve test coverage.
7.4.5 Verification Suite Design(Cont’)
Random testing
It greatly enhances our verification capabilities, but it does have limitations.
Runtimes can get very long when trying to achieve very high coverage.
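Both the benefit and the limitation can be seen in a coverage-driven random loop. The coverage model below, bins of a 2-bit opcode crossed with an address-boundary flag, is invented for illustration; the point is that the rare bins close slowly, so runtime grows as coverage approaches the target.

```python
# Sketch of coverage-driven random testing: generate random stimuli
# until a coverage target is reached. The coverage bins (2-bit opcode
# crossed with "address is at a boundary") are a hypothetical model.

import random

def random_test_loop(target=1.0, max_runs=10_000, seed=7):
    rng = random.Random(seed)          # fixed seed for reproducibility
    all_bins = {(op, b) for op in range(4) for b in (False, True)}
    hit = set()
    runs = 0
    while len(hit) / len(all_bins) < target and runs < max_runs:
        op = rng.randrange(4)
        addr = rng.randrange(256)
        # Boundary addresses (0 and 255) occur rarely, so the
        # (op, True) bins are the slow ones to close.
        hit.add((op, addr in (0, 255)))
        runs += 1
    return runs, len(hit) / len(all_bins)
```

Random stimulus reaches scenarios no one wrote a directed test for, but notice how many runs the rare bins cost; pushing the last few percent of coverage is where runtimes explode.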
7.4.5 Verification Suite Design(Cont’)
7.4.6 Code Coverage Analysis
The coverage tool provides:
Statement coverage
Branch coverage
Condition coverage
Path coverage
Toggle coverage
Triggering coverage
Types of Coverage Tests
Statement coverage gives a count, for each executable statement, of how many times it was executed.
Branch coverage verifies that each branch in an if-then-else or case statement was executed.
Condition coverage verifies that all branch sub-conditions have triggered the condition branch.
Path coverage checks which paths are taken between adjacent blocks of conditional code.
Triggering coverage checks which signals in a sensitivity list trigger a process.
Toggle coverage counts how many times a particular signal transitions from ‘0’ to ‘1’ and how many times it transitions from ‘1’ to ‘0’.
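Toggle coverage is simple enough to sketch directly. This is a toy recount over sampled waveform values, not how a commercial tool instruments a simulation; the signal names are illustrative.

```python
# Sketch of toggle coverage: count 0->1 and 1->0 transitions for each
# signal over a sampled waveform. Signal names are hypothetical.

def toggle_coverage(waveforms):
    """waveforms: {signal: [0/1 samples]} -> {signal: (rises, falls)}."""
    counts = {}
    for sig, samples in waveforms.items():
        rises = falls = 0
        for prev, cur in zip(samples, samples[1:]):
            if prev == 0 and cur == 1:
                rises += 1
            elif prev == 1 and cur == 0:
                falls += 1
        counts[sig] = (rises, falls)
    return counts

# A signal that never toggles in either direction is a coverage hole:
cov = toggle_coverage({"clk": [0, 1, 0, 1, 0], "stuck": [0, 0, 0, 0, 0]})
```

A (0, 0) entry, like the `stuck` signal here, flags logic the test suite never exercised in that direction.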
Recent Progress in Coverage Tools
Code coverage can be used to prune the overall test suite, eliminating redundant tests and ordering tests so that the first tests run provide the highest incremental coverage.
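One way to realize this pruning and ordering is a greedy pass over per-test coverage data. The test names and covered-statement sets below are invented for illustration; a real flow would read them from the coverage tool's database.

```python
# Sketch of coverage-based test ordering: greedily run the test with
# the highest incremental coverage first, and drop tests that add no
# new coverage. Test names and statement sets are hypothetical.

def order_by_incremental_coverage(suite):
    """suite: {test_name: set of covered statements} -> (order, covered)."""
    remaining = dict(suite)
    covered, ordered = set(), []
    while remaining:
        # Pick the test adding the most not-yet-covered statements.
        best = max(remaining, key=lambda t: len(remaining[t] - covered))
        gain = remaining.pop(best) - covered
        if not gain:
            break                      # every remaining test is redundant
        ordered.append(best)
        covered |= gain
    return ordered, covered

suite = {
    "t_basic":  {1, 2, 3},
    "t_corner": {3, 4, 5, 6},
    "t_dup":    {1, 2},        # fully redundant with t_basic
}
ordered, covered = order_by_incremental_coverage(suite)
```

Running `t_corner` first yields the highest incremental coverage, and `t_dup` is pruned entirely, exactly the reordering and redundancy elimination described above.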
Code coverage tools are still limited in their coverage; see the comments on path coverage above.
7.5 Timing Verification
Static timing verification is the most effective method of verifying a macro’s timing performance.
Static timing analysis should be performed on the resulting netlists to verify that they meet the macro's timing objectives.
Gate-level simulation is most useful in verifying timing for asynchronous logic.
Static timing verification tends to be pessimistic unless false paths are manually identified and excluded from the analysis.
The best overall timing verification methodology is to use static timing analysis as the basis for timing verification.
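At its core, static timing analysis is a longest-path computation over the netlist graph, which a short sketch can make concrete. The gate delays and the tiny netlist below are invented; a real tool also handles setup/hold constraints, multiple corners, and false-path exceptions.

```python
# Sketch of the core of static timing analysis: compute the worst-case
# arrival time at each node of a combinational netlist (a DAG) by a
# memoized longest-path traversal, then compare against the clock
# period. Gate delays and the example netlist are hypothetical.

from collections import defaultdict

def worst_arrival_times(edges, delays):
    """edges: list of (src, dst); delays: {node: gate delay in ns}."""
    fanin = defaultdict(list)
    nodes = set(delays)
    for s, d in edges:
        fanin[d].append(s)
        nodes.update((s, d))
    arrival = {}

    def at(node):
        # Worst arrival at a node = its own delay plus the latest
        # arrival among its fanin nodes (0 for primary inputs).
        if node not in arrival:
            preds = fanin[node]
            arrival[node] = delays.get(node, 0) + (
                max(at(p) for p in preds) if preds else 0)
        return arrival[node]

    return {n: at(n) for n in nodes}

# in1/in2 feed an AND gate; in2 also takes a slower path through a NOT
# gate; both paths converge at an XOR driving the output.
times = worst_arrival_times(
    edges=[("in1", "and1"), ("in2", "and1"), ("and1", "xor1"),
           ("in2", "not1"), ("not1", "xor1")],
    delays={"in1": 0, "in2": 0, "and1": 2, "not1": 3, "xor1": 1},
)
slack = 10 - times["xor1"]   # positive slack: meets a 10 ns clock
```

Because this traversal enumerates no input vectors, it checks every path exhaustively, which is exactly why static analysis is both complete and, without false-path exceptions, pessimistic.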