Test and Verification Solutions
ARM Based SoC Verification
Mike Bartley, Avanish Sachan, Abhineet Sanghvi, Christdas
Test and Verification Solutions
Agenda
• System Verification Challenges
• SoC Verification Methodologies
– Formal
– HW/SW
– FPGA Prototyping
• Q&A
Typical ARM Based SoC Architecture
[Block diagram: ARM core with boot ROM controller, boot config, PLL, debug & trace and test interface, connected through the ARM interconnect to a memory subsystem (memory bridge, DMA controller, DDR controller, on-chip SRAM controller), an external bus interface, a timer, high-speed peripherals, and low-speed peripherals (UART, I2C, GPIO).]
Looks simple BUT …
Why is SoC different? System Verification Challenges
High-potential bug areas in a SoC:
Unexpected access conflicts between shared resources
Complexities arising from interactions between subsystems that were verified standalone
Cache coherency in multi-core systems
Interrupt connectivity and priority schemes
Arbitration priority issues and access deadlocks
Unexpected HW/SW sequencing
Exception handling conflicts and priority schemes
Multiple power domain regions and clock domain crossings
Multiple reset and clock regions
Top 3 Verification Challenges
• Top 1: Mastering Verification Complexity
– Continuous increase in the number of IPs and embedded processors
• 2006: 30-40 IPs, 1 CPU
• 2011: 80+ IPs, 6+ CPUs
• 2016: 120+ IPs, 20 CPUs?
– The more IPs, the higher the risk of late spec and implementation changes
– Driving towards true HW/SW co-verification
2. Scalability
• Constrained-random simulation has proven to be a good bug-hunting flow, but...
– How many cycles are needed to verify 2 weeks at a target speed of 1 GHz?
• Answer: 0.6 × 10^15
Target cycles: 10^15
– Simulation (kHz): 1,000,000 sim slots
– Emulation (1 MHz): 1,000 emulation slots
– FPGA (10 MHz): 100 FPGA slots
– Si (1 GHz): 1 chip
• How will we scale simulation, emulation and FPGA to the next generation of CPUs? SoCs?
• What are the alternatives?
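As a rough order-of-magnitude check on the figures above (assuming here that two weeks of target operation is about $10^6$ seconds and that a software simulator runs at roughly 1 kHz):

$$\text{cycles} \approx f_{\text{target}} \times t \approx 10^{9}\,\text{Hz} \times 10^{6}\,\text{s} \approx 10^{15}, \qquad \text{sim slots} \approx \frac{10^{15}\ \text{cycles}}{10^{3}\,\text{Hz} \times 10^{6}\,\text{s}} = 10^{6},$$

which is consistent with the 1,000,000 simulation slots quoted in the table.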
ARM SoC DV Methodology
This session describes the verification methodologies currently used in SoC verification:
Formal Verification
Hardware/Software Co-Verification
FPGA Prototyping
Formal Verification
Formal verification
is a systematic process that uses mathematical reasoning to verify the design
works algorithmically, exhaustively exploring all possible input values over time
needs no stimulus!
Why use formal?
100% coverage (if you set it up correctly)
useful when it is difficult to work out how to stimulate the design
useful when it is difficult to create a checker
An ideal formal verification tool requires capacity in order to handle complex designs
Simulation Depth-first vs. Formal Breadth-first
Think of the state space as a tree, where the nodes are states of the simulation and the arcs are clocked transitions: simulation explores it depth-first, formal explores it breadth-first.
But the trees are
– very wide
– very deep
Model Checking – a very brief introduction
Inputs to the tool
• 3 inputs to the tool:
– A model of the design – usually RTL
– A property, or set of properties, representing the requirements
• For example: items are transmitted to one of three destinations within 2 cycles of being accepted
• (req_in && gnt_in) |-> ##[1:2] (req_a || req_b || req_c)
– A set of assumptions, expressed in the same language as the properties – typically constraints on the inputs to the design
• For example: the request signal is stable until it is granted
• (req_in && !gnt_out) |-> ##1 req_in
• We would of course need a complete set of constraints
Apps allow you to auto-generate properties and constraints (see the sketch below)
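A minimal sketch of how the example property and assumption above could be packaged for a formal tool as a bindable SystemVerilog module. The clock name and the consistent use of gnt_out are assumptions here; the expressions otherwise follow the slide.

// Illustrative only: property/assumption wrapper for a formal tool.
module formal_props (
  input logic clk,
  input logic req_in, gnt_out,
  input logic req_a, req_b, req_c
);
  default clocking cb @(posedge clk); endclocking

  // Requirement: an accepted item reaches one of the three destinations
  // within 1 to 2 cycles.
  assert_route: assert property (
    (req_in && gnt_out) |-> ##[1:2] (req_a || req_b || req_c)
  );

  // Assumption (input constraint): a request stays asserted until granted.
  assume_req_stable: assume property (
    (req_in && !gnt_out) |-> ##1 req_in
  );
endmodule

// Typically bound onto the DUT without modifying it, e.g.:
//   bind dut formal_props u_formal_props (.*);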
INTERCONNECTIVITY VERIFICATION
Formal properties can be written to verify:
Connectivity between two IPs within the SoC hierarchy
Connectivity between the IP and DUT ports through the pads
The default reset values of all the DUT ports
Pad ring structure and control validation
Point-to-point and point-to-multipoint interconnectivity checks
Using formal analysis for connectivity checking facilitates:
Controllability
Completeness
Generation of formal properties can be automated by running scripts on spreadsheets.
Formal tools like IFV (Cadence) and Questa (Mentor) support connectivity checking.
Example
[Diagram: Block A (port PortA1) and input IN1 feed Block B, which contains Block C (ports Port1 and Port2); a Ctrl signal qualifies the connection.]
assert_in1__blockc_port1:
  assert property ((dut.ctrl) throughout
                   dut.in1 == dut.blockb.blockc.port1);
assert_blocka__blockc_port2:
  assert property (dut.blocka.porta1 == dut.blockb.blockc.port2);
BUS PROTOCOL COMPLIANCE VERIFICATION
Protocol compliance checking using formal verification involves:
Developing a formal specification of the protocol
Modelling the bus interface of each component as an FSM
Defining the protocol as the synchronous composition of these state machines
The finite-state description of the protocol can then be formally verified by a state-space search.
This helps in verifying advanced features of SoC bus protocols such as:
Pipelining – between the address/control and data phases
Wait cycles – within the transfers
Split transfers – to facilitate bus arbitration
A sketch of such checks is shown below.
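A minimal sketch of what such compliance properties can look like for an AHB-style interface. Signal names follow AMBA AHB conventions, but the two checks (control held stable during wait states, and a bound on wait states before HREADY) are simplified illustrations rather than a complete compliance suite.

// Illustrative AHB-style protocol checks, not a full compliance monitor.
module ahb_protocol_checks (
  input logic        hclk, hresetn,
  input logic        hready,
  input logic [1:0]  htrans,   // 2'b10 = NONSEQ, 2'b11 = SEQ, 2'b01 = BUSY, 2'b00 = IDLE
  input logic [31:0] haddr
);
  default clocking cb @(posedge hclk); endclocking
  default disable iff (!hresetn);

  // Wait cycles: while HREADY is low during an active transfer, address and
  // control must be held stable (the address phase is extended).
  assert_ctrl_stable_during_wait: assert property (
    (htrans inside {2'b10, 2'b11}) && !hready |=> $stable(haddr) && $stable(htrans)
  );

  // Pipelining between address and data phases: the data phase of an active
  // transfer should complete (HREADY high) within a bounded number of cycles.
  // The AHB spec only recommends a 16-wait-state limit, so this bound is
  // illustrative rather than a hard protocol rule.
  assert_data_phase_completes: assert property (
    (htrans inside {2'b10, 2'b11}) |-> ##[1:16] hready
  );
endmodule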
LOW POWER VERIFICATION
The Unified Power Format (UPF) is a low power specification standard approved by Accellera.
It captures the low power design specification in a portable form so that the correct power management components:
Can be implemented at the RTL
Are inferred correctly during synthesis
Are placed-and-routed efficiently and accurately in the physical design
Formal verification techniques can be used for:
UPF-specific assertions
Power control sequence-based assertions
Assertions to check for incorrect or missing level shifters
Similar usage can also be deduced for designs incorporating the Silicon Integration Initiative's (Si2's) Common Power Format (CPF).
A sketch of a power-control-sequence check is shown below.
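A minimal sketch of power-control-sequence assertions of the kind listed above, assuming a simple domain with hypothetical control signals iso_en (isolation enable), ret_save (retention save) and pwr_req/pwr_ack (power-down request/acknowledge); a real design would take these names and the exact ordering rules from its UPF description.

// Illustrative power-sequencing checks for one power domain.
module pwr_seq_checks (
  input logic clk, rst_n,
  input logic iso_en, ret_save, pwr_req, pwr_ack
);
  default clocking cb @(posedge clk); endclocking
  default disable iff (!rst_n);

  // Isolation must already be enabled when power-down is requested.
  assert_iso_before_pwrdn: assert property (
    $rose(pwr_req) |-> iso_en
  );

  // State retention must have been saved at least one cycle before the
  // power-down request.
  assert_retention_before_pwrdn: assert property (
    $rose(pwr_req) |-> $past(ret_save)
  );

  // Isolation may only be released once the domain is powered up again
  // (pwr_ack high is taken to mean "domain is off" in this sketch).
  assert_iso_held_while_off: assert property (
    $fell(iso_en) |-> !pwr_ack
  );
endmodule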
DFT VERIFICATION
Scan chain connectivity verification: place-and-route tools change the order of the scan flops to minimize congestion. In this process, scan chains might be disconnected and lockup latches removed, thereby affecting test pattern generation or memory-BIST functionality.
Static scan-chain integrity checks allow immediate verification that both the order and the number of flops are as expected for different revisions of the netlist.
Formal properties can also be written to validate:
Test mode entry sequence
Test mode exit values
Advantages: reduced DFT verification cycles, test mode functional validation, equivalence checking among versions of netlists.
A sketch of such checks is shown below.
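A minimal sketch of the scan-chain and test-mode properties described above, for a hypothetical single-bit chain of CHAIN_LEN flops; in practice the chain length, flop order and signal names would be extracted from the DFT netlist.

// Illustrative DFT checks; names and chain model are placeholders.
module dft_checks #(parameter int CHAIN_LEN = 4) (
  input logic clk, scan_en, scan_in, scan_out, test_mode
);
  default clocking cb @(posedge clk); endclocking

  // Scan-chain integrity: while scan_en is held, the bit presented at
  // scan_in reappears at scan_out exactly CHAIN_LEN cycles later
  // (order/length check for an ideal chain).
  assert_scan_chain_length: assert property (
    scan_en [*CHAIN_LEN+1] |-> scan_out == $past(scan_in, CHAIN_LEN)
  );

  // Test mode entry/stability: once test_mode is asserted it must stay
  // asserted while shifting is in progress (no accidental exit).
  assert_test_mode_stable: assert property (
    test_mode && scan_en |=> test_mode
  );
endmodule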
Model Checking – a very brief introduction
Outputs from the tool
• Proved
– The property holds for all valid sequences of inputs.
• Failed(n)
– There is at least one valid sequence of inputs of length n cycles, as defined by the design clock, for which the property does not hold.
– In this case, the tool gives a waveform demonstrating the failure.
– Most algorithms ensure that n is as small as possible, but some more advanced algorithms don't.
• Explored(n)
– There is no way to make the property fail with an input sequence of n cycles or fewer.
– For large designs, the algorithm can be expensive in both time and memory and may not terminate.
Some Limitations of Formal Verification
Size limits – not always feasible for large designs
Better suited to control checking than to data checking
Hardware/Software Co-Verification
Today we make products made up of hardware and software.
The ability to achieve first-silicon and first-software success relies on the capability of the verification environment to support full-system hardware/software co-verification.
Software engineers need to be able to access the hardware design to integrate software functionality with the hardware.
In SoC verification, co-simulation provides the facility to verify hardware and software functionality together.
Top Level Test Generation for CPU
[Diagram: a scenario generator produces tests that pass through a compiler flow into the testbench; the SoC model (CPU, memory and fabric connecting blocks A, B and C via BFMs) executes them while coverage, expected results and observed results are collected.]
Bias tests to hit interesting corner cases
– Scenario interleaving
– Target shared resources / 'points of convergence'
Non-repetitive useful tests
There should be an efficient workflow
– Generation performance
– Target diverse platforms
– Ease of use
– Maintainability
– Reuse (of testing knowledge)
– Effective result checking: propagation of results, trace comparison
A sketch of biased test generation is shown below.
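A minimal sketch of biased constrained-random generation in the spirit of the "target shared resources" point above. The class, the three masters and the shared address window are hypothetical; a real flow would push scenarios like this through the compiler flow shown in the diagram.

// Illustrative biased scenario generator; all names/ranges are placeholders.
class soc_scenario;
  typedef enum {CPU0, CPU1, DMA} master_e;

  rand master_e     master;
  rand bit [31:0]   addr;
  rand int unsigned burst_len;
  rand bit          target_shared;   // biasing knob

  constraint c_burst { burst_len inside {[1:16]}; }

  // Bias roughly 70% of accesses into a contended shared SRAM window so
  // that multiple masters converge on the same resource.
  constraint c_bias {
    target_shared dist {1 := 7, 0 := 3};
    target_shared -> addr inside {[32'h4000_0000 : 32'h4000_0FFF]};
  }
endclass

module tb_gen_demo;
  initial begin
    soc_scenario s = new();
    repeat (10) begin
      void'(s.randomize());
      $display("master=%s addr=0x%08h len=%0d", s.master.name(), s.addr, s.burst_len);
    end
  end
endmodule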
Hardware/Software Co-Verification Flow
[Flow diagram: in the software environment, SW tools (compiler, linker, debugger) produce an executable object file that is loaded into the memory model; in the hardware environment, HDL simulation tools run the DUT against that memory model, with output fed back to the debugger tools.]
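A minimal sketch of the hand-off point in this flow: the executable object file is converted to a hex image by the SW toolchain (for example with objcopy) and preloaded into a simple testbench memory model before the CPU starts fetching. The file name, memory size and interface are assumptions for illustration.

// Illustrative boot memory model preloaded with the compiled SW image.
module boot_memory #(
  parameter int DEPTH = 1 << 16          // 64K words (assumed size)
) (
  input  logic        clk,
  input  logic        rd_en,
  input  logic [15:0] addr,
  output logic [31:0] rdata
);
  logic [31:0] mem [0:DEPTH-1];

  initial begin
    // Image produced offline by the SW toolchain, e.g.:
    //   arm-none-eabi-objcopy -O verilog boot.elf boot.hex
    $readmemh("boot.hex", mem);
  end

  // Simple synchronous read port for the CPU/bus model.
  always_ff @(posedge clk)
    if (rd_en) rdata <= mem[addr];
endmodule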
How to go faster!
Compute farms, emulators, FPGAs and test chips
The ‘tradeoffs’ for different platforms
(Favours lots of short tests … but tests also need to be loaded and results dumped!)
Speed: compute farm 10 Hz – 100 Hz per machine | emulator 1 MHz | FPGA 2 MHz – 50 MHz (needs partitioning) | test chip GHz
Observability: compute farm total | emulator trace window + host debug | FPGA probes + host debug | test chip host debug
Behavioural testbench? compute farm yes | emulator co-emulation (speed penalty) | FPGA co-emulation (speed penalty) | test chip no
Test in 'real world' systems: compute farm no | emulator host debug + ICE with speed bridges | FPGA mostly | test chip yes
Are fails easily reproducible in simulation? compute farm yes | emulator yes | FPGA no | test chip no (complex timing dependencies)
Bring-up time: compute farm minutes | emulator weeks down to hours | FPGA weeks to days | test chip months (depends on process maturity)
FPGA Prototyping
It allows faster simulation and close-to-real-time operation, which can help identify a different class of bugs.
Integrated hardware-software testing.
It can be hard to map the design to the FPGA.
Debug is limited, although the FPGA provides some debug capability through JTAG and the specialized debug infrastructure built into it.
Connecting the FPGA with simulation
[Diagram: the logic simulation, containing a BFM, communicates via inter-processor communication (a socket) with the design mapped onto the FPGA, alongside other hardware on the board.]
Standard Co-Emulation Modeling Interface (SCE-MI)
A standard for writing transactors that connect behavioural test benches or models to emulators or FPGA platforms
Message- or transaction-oriented rather than event-oriented
Incorporates a synthesisable BFM
A subset of SystemVerilog DPI capabilities
Designed to integrate with SystemC
[Diagram: a behavioural test bench on the host connects through a SCE-MI SW link to the SCE-MI infrastructure in the emulator, which hosts the RTL together with a custom block and a monitor.]
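A minimal sketch of the hardware side of a function-based transactor in the spirit of the SCE-MI 2 function-call (DPI) style: the behavioural test bench calls an exported SystemVerilog function to push one message into the synthesisable BFM, which then drives pin-level activity over subsequent clocks. The names and single-word payload are illustrative, and the matching C/SystemC caller is not shown.

// Illustrative message-level transactor (hardware side only).
module simple_xactor (
  input  logic        clk,
  output logic        valid,
  output logic [31:0] data
);
  // Host-to-hardware entry point (message level, not event level).
  export "DPI-C" function hw_send_transaction;

  logic [31:0] pending;
  bit          pending_valid;

  function void hw_send_transaction(input int unsigned payload);
    pending       = payload;
    pending_valid = 1'b1;
  endfunction

  // The clocked BFM converts each buffered message into one cycle of
  // pin-level activity, then marks it consumed so the host can send more.
  always @(posedge clk) begin
    valid <= pending_valid;
    data  <= pending;
    if (pending_valid) pending_valid = 1'b0;
  end
endmodule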
SCE-MI: What are the issues?
Performance
– Need to consider both bandwidth and latency!
– Try to make operations non-blocking
– Performance of the behavioural testbench may limit the maximum speedup
Time
– The time for SCE-MI communications will depend on workstation performance
– Simulation time may be suspended during SCE-MI communications to maintain determinism by consuming zero emulation time
Concurrency
– Concurrency is created by verification components operating independently of the DUT
– A component in a process or thread outside the emulator time domain can prevent the verification environment from being deterministic
Determinism
– Allowing non-determinism may increase performance
– However, determinism is crucial for ease of debug
Maintenance
– Each protocol needs a SCE-MI transactor
– These need to be verified and maintained
FPGA Prototyping Limitations
• Many FPGAs are required to partition the SoC, leading to prototype system complexity.
• The target SoC may consist of many functional modules written at different abstraction levels in the course of different development phases; only synthesizable modules can be mapped into an FPGA and run for debugging.
• Inability to partition multiple clock and reset trees.
• The FPGA provides limited debug capability and visibility during a single iteration, hence multiple iterations may be required to narrow down to the specific bug.
Conclusion
• Increasing complexity
– Number of CPUs and IPs
– Multiple clock domains, multiple power domains
– Coherent shared memory structures
• Scalability issues
– Is dynamic verification enough?
– Number of cycles
– Writing complex tests and checkers
– Consider formal
• Product = hardware + software
– Hardware/software co-verification
– Need longer simulations
– Consider FPGA
Any Questions?

Benefits of working with T&VS
• Faster time-to-market
• Improved quality / reduced product risk
• Lower development costs
• Improved product features
• Trusted partner

Cost savings
o "TVS engineers saved 2 months on our 12 month schedule, and reduced our costs by 20%" – Steve Neill, Managing Director of Infineon UK
Resource flexibility
o "TVS provided the OVM-compliant verification IP on schedule and to budget, and were flexible enough to incorporate change requests." – Simon Knowles, VP for Strategy & DXP Technology, Icera
Increased quality of deliverables
o "TVS helped us incorporate the VIP into our test environment in good time for our tape-out and left us with complete documentation to allow us to repeat the work should we ever need to on a future generation of our technology." – Fred Homewood, CEO of Gnodal