
White Paper

March 2014

Rethinking SoC Verification: Enabling Next-Generation Productivity & Performance

Rebecca Lipon
Senior Product Marketing Manager, Synopsys

Introduction

The introduction of the iPhone in 2007 represented a fundamental shift in electronic system design: moving advanced processing power off the desktop and into the hands of users everywhere, always. This shift has led to a revolution in mobile and an expansion into the Internet-of-Things, with wearables, connected automobiles and connected homes (See Figure 1).

Figure 1: Mobile and Internet-of-Things driving growth. Multicore CPUs, multicore graphics, multimedia, connectivity, and sensors converge onto a single SoC, driving verification complexity, power-efficiency requirements, more software, and time-to-market pressure, and with them new verification challenges.

This revolution is causing profound technology challenges in the semiconductor industry. Each of these systems has connectivity demands requiring the use of standard protocols such as Bluetooth, USB, LTE, and MIPI. Users’ need for longer battery life requires ultra-low power design on every SoC. Innovation in user interface design is similarly creating challenges for the SoC: touch, temperature, and pressure sensors are now an expectation. Users’ desire for apps on their mobile phones also creates the requirement for software development in tandem with hardware development, so users will be able to access content the moment they purchase a new device. And these multi-use devices must be upgraded every few months to remain competitive in the mobile marketplace.

In response, design itself is undergoing a revolution, integrating ever-greater system complexity onto a single chip. Today, a leading-edge mobile-enabled electronic system is based on a SoC that contains more than a billion gates, at least 10 interface protocols, up to hundreds of IP blocks, power domains and clock domains, and millions of lines of code.

This increase in SoC design complexity has created orders-of-magnitude greater challenges in SoC verification. Not only is there a need for sheer capacity in verification technology, but also the need to verify vastly more scenarios: power management, analog components, device-level software, low power structural checks and much more. Companies have been leveraging many point tools to address these challenges, patching holes in their verification environments with disjoint flows. Because of duplicate steps, incompatible databases and different debug environments, making disjoint flows work is extremely costly in time and resources, and has a significant impact on ease-of-use and productivity. Moreover, the results of point tools are difficult to integrate into standard sign-off flows. Building a comprehensive, unified and integrated verification environment is required for today’s revolutionary SoCs.

As a result, we are at an inflection point in this industry that calls for new, integrated verification solutions that will offer a fundamental shift forward in productivity, performance, capacity and functionality. Synopsys is meeting this demand with Verification Compiler™. Verification Compiler provides the software capabilities, technology, methodologies and VIP required for the functional verification of advanced SoC designs in one solution (See Figure 2).

Figure 2: Verification Compiler, unifying simulation, static, formal, VIP, coverage, and debug in one solution.

The first step in addressing the productivity demands of these large SoCs is to provide the next generation of high-performance, high-capacity static and formal solutions that enable bug detection and prevention much earlier in the design flow. Current static and formal technology cannot scale to address the concerns of large-scale SoCs. In the past these technologies were often deployed only at block level, but given the complexity and size of current chips, a new solution is necessary. Verification Compiler provides next-generation static and formal technology delivering a 3-5X performance improvement and the capacity to analyze a complete SoC.

Early bug detection and prevention with a high-capacity solution is important to productivity, but those engines are far more powerful when integrated into a single verification solution. Engines that leverage a common front-end compiler as well as common coverage and debug interfaces reduce setup overhead and give users much better visibility across all techniques applied in the verification process. Truly addressing a complex problem often requires a combination of formal or static techniques, dynamic simulation approaches, and debug visibility across the entire flow.

Let’s take the specific example of low power SoC verification. Low power verification has always required specifying intended power design, and then having all technologies in the flow adhere to and model that intention accurately. However, when the definition of power intent was first standardized, it did not take into account that the sources of a net might be from an analog component and not a purely digital one. In a SoC, a user touching a screen absolutely should be able to wake up a device, which requires that the power-on sequence for that device can take input from analog circuitry. This behavior has required new advances in simulation to enable accurate modeling.
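The power intent the paragraph refers to is typically captured in UPF (IEEE 1801), which the paper lists among its supported input formats. A minimal sketch of such an intent file might look as follows; the domain, instance, and signal names here are purely illustrative, not from the paper:

```tcl
# Hypothetical UPF power-intent sketch (IEEE 1801); names are illustrative.
create_power_domain PD_TOP                          ;# always-on top domain
create_power_domain PD_SENSOR -elements {u_sensor}  ;# switchable sensor block

# Isolation policy: clamp outputs of the switchable domain to 0 while it is
# powered off, so downstream always-on logic never sees corrupted values.
set_isolation iso_sensor -domain PD_SENSOR \
    -applies_to outputs -clamp_value 0
set_isolation_control iso_sensor -domain PD_SENSOR \
    -isolation_signal iso_en -isolation_sense high -location parent
```

Every engine in the flow, static, formal, simulation, and debug, must interpret this same intent consistently; a missing or mis-scoped policy such as `iso_sensor` is exactly the class of bug Figure 3 illustrates.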

Similarly, when a smartphone turns on, a complex sequence of events occurs across hundreds of IP blocks to ensure that the device wakes up properly. Traditionally, reset simulations are run at gate level to mimic the way actual hardware resolves X states and to ensure the device will operate properly. Gate-level simulation isn’t practical for complex SoCs, however, while RTL simulation can be overly optimistic with these X values and mask bugs. With the increase in low power designs, the situation has only become worse, since power-aware chips act as if they are in reset mode whenever they recover from power-shutoff and can enter even more complex states recovering from standby. Verification engineers need to regularly validate X-propagation issues, particularly in low power designs, preferably at RTL for the fastest simulation.
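The RTL optimism described above can be shown with the classic reset idiom; this sketch is illustrative and not taken from the paper:

```systemverilog
// Illustrative sketch (not from the paper) of classic RTL X-optimism.
module xopt_example (
  input  logic       clk,
  input  logic       rst_n,   // may power up as X before reset resolves
  input  logic [7:0] d,
  output logic [7:0] q
);
  always_ff @(posedge clk or negedge rst_n) begin
    if (!rst_n)
      q <= '0;      // with rst_n = X, plain Verilog evaluates the condition
    else            // as false and takes the else branch, behaving as if
      q <= d;       // reset were cleanly deasserted; X-prop semantics
  end               // instead propagate the unknown into q, exposing the
endmodule           // ambiguous reset at RTL simulation speed
```

Standard Verilog `if` semantics treat an X condition as false, so the unresolved reset is silently hidden; an X-propagation-aware simulation drives `q` to X instead, surfacing the bug without resorting to gate-level runs.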


Figure 3: Native Low Power RTL simulation with and without X-Prop. In conventional Native Low Power RTL simulation (Verilog semantics), the corruption value from a missing ISO policy is masked by RTL semantics, while correctly inferred isolation properly blocks the corruption value from the OFF domain. With X-Prop, the corruption value from the missing ISO policy is properly propagated by X-Prop semantics, exposing the bug.

In Verification Compiler, X-propagation simulation can be run simultaneously in a native low power and analog mixed-signal cosimulation environment. All of these engines can share coverage data and be debugged together, allowing far more robust verification. A common understanding of power intent, auto-generated low power assertions, advanced modeling techniques like X-propagation analysis, and other valuable simulation data present a better low power debug view of the design (See Figure 3).

Figure 4: Find source of X with Power-Aware Debug

Another area of particular pain in SoC verification is clock domain crossing (CDC) validation. With the increase in low power design techniques, it is entirely possible to have multi-voltage, adaptive-frequency designs, making the task of ensuring that clocks are stable when logic requires them incredibly daunting. For example, every element at the boundary of every power domain should have its isolation enabled either in the source domain or synchronized to the source domain (See Figure 4). This requirement means that the static tools traditionally used for CDC also have to share data with low power static-checking tools to achieve accurate validation. Moreover, to truly validate at the SoC level, these static tools need far greater capacity than they have had in the past. The system has to be checked holistically, and the tools have to share data.
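For a single-bit crossing, the structure that structural CDC checking looks for is the familiar two-flop synchronizer; this sketch is illustrative, with hypothetical names:

```systemverilog
// Illustrative sketch: the two-flop synchronizer that structural CDC
// checking expects on single-bit crossings (names are hypothetical).
module sync_2ff (
  input  logic clk_dst,   // destination clock domain
  input  logic d_src,     // signal launched from the source clock domain
  output logic q_dst
);
  logic meta;
  always_ff @(posedge clk_dst) begin
    meta  <= d_src;   // first flop may go metastable
    q_dst <= meta;    // second flop gives it a cycle to settle
  end
endmodule
```

Static checking can flag a crossing that lacks such a structure, but only formal analysis and simulation can confirm data stability and protocol correctness across the crossing, which is why all three techniques are needed together.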


Figure 5: Next-generation static and formal verification. A unified logic database and unified hardware inferencing, reading Verilog, VHDL, SV, .lib, UPF, SDC and more, feed optimized design databases (LPDB, ClockDB, ExtendedDB, TestDB) that serve CDC checking, formal property checking, low power checks, advanced lint, and FSM analysis through a command interface, GUI with schematics, custom reports, and a save/restore Tcl interface.

Figure 6: Clock domain crossing (CDC) validation needs static, formal verification and simulation. Static (structural) CDC checking is complemented by formal CDC checking for data loss, data stability, FIFOs and handshakes, and by CDC jitter simulation across flops in the source and destination clock domains.

Verification Compiler offers a high-capacity, integrated front-end compiler with common coverage that allows the full chip to be verified, and next-generation static and formal applications that share data, leading to comprehensive, robust validation for SoCs (See Figure 5). All of these activities leverage the same configuration as simulation and other verification activities, so these flows are completely contiguous: Verification Compiler truly provides one verification closure flow. The design is compiled once, targeting static and formal checks to identify design bugs related to CDC, low power, connectivity and other scenarios earlier; then, with the exact same setup in the exact same solution, dynamic simulations can be run on the scenarios that truly require them, while leveraging common coverage and debug across all engines. This integration means verification engineers no longer have to be experts at every static and formal check that could be run: they can easily run these checks without setting up a new flow, and the checks work together to catch scenarios that never could have been tested in a disjoint point-tool solution (See Figure 6).

Synopsys, Inc. • 700 East Middlefield Road • Mountain View, CA 94043 • www.synopsys.com

©2014 Synopsys, Inc. All rights reserved. Synopsys is a trademark of Synopsys, Inc. in the United States and other countries. A list of Synopsys trademarks is available at http://www.synopsys.com/copyright.html . All other names mentioned herein are trademarks or registered trademarks of their respective owners.

While simulators have provided debugging solutions for many years, the ability to interactively debug a simulation as it runs helps identify issues faster and more easily than post-process debugging ever could. Root-causing constraint conflicts and performing what-if analysis of different possible values for a random variable can all be done without leaving the debug interface or recompiling the design. Tracing unknown values in power-aware simulation with X-propagation analysis can be done interactively through the power-on sequence until this tricky behavior is validated, without ever having to run a simulation to completion. Pinpointing testbench bugs is far more effective with transaction-level visualization of dynamic objects in the waveform window while stepping through the simulation cycle by cycle. RTL can also be verified interactively alongside object code running on the embedded processor, allowing users to verify both the processor and the hardware simultaneously. A common front end, native traversal of the design hierarchy, and robust compression of dumped data built into Verification Compiler allow far higher-capacity designs to be debugged efficiently. Integration with static technology and awareness of VIP in the debugger let users abstract the visualization of their chip to a higher level, enabling far more efficient protocol-level analysis of issues. Common, high-capacity, interactive debug is the only way to effectively visualize and ensure the validity of large SoCs, and all of this is available with Verification Compiler (See Figure 7).

Figure 7: Verdi Protocol Analyzer and Low Power Debug views

A deep integration between VIP and the simulation engine can also greatly improve productivity. Verification Compiler’s constraint engine is tuned for optimal performance with its VIP library, and integrated debug solutions for VIP allow protocol-level and transaction-based analysis alongside the rest of the testbench. Verification plans for protocol-level coverage are provided for all VIP titles and are easy to incorporate as subplans into the overall project-tracking process. The VIP library also comes enabled for pre-compilation, meaning it can always be compiled separately and linked as a pre-compiled .so file, saving compilation time at the SoC level and reducing disk space.
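The constraint engine mentioned above operates on constrained-random sequence items of the kind a protocol VIP exposes. A minimal sketch of such an item, with purely hypothetical class and field names, might look like this:

```systemverilog
// Hypothetical sketch of a constrained-random sequence item such as a
// protocol VIP might expose; class and field names are illustrative.
`include "uvm_macros.svh"
import uvm_pkg::*;

class bus_xact extends uvm_sequence_item;
  rand bit [31:0] addr;
  rand bit [7:0]  data[];
  rand bit        is_write;

  constraint c_align { addr[1:0] == 2'b00; }            // word-aligned
  constraint c_len   { data.size() inside {[1:16]}; }   // bounded burst

  `uvm_object_utils(bus_xact)

  function new(string name = "bus_xact");
    super.new(name);
  endfunction
endclass
```

Randomizing many such items per cycle is where a constraint solver tuned to the VIP’s own constraint patterns pays off at the SoC level.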

Verification Compiler allows users to find bugs earlier, increase their reuse of IP across projects, and run faster and smarter simulations all in one environment. Verification Compiler provides the right engine for each challenge with no additional setup time; furthermore it provides consistent debug and coverage visualization across all engines, greatly improving productivity in analyzing results and identifying design issues. Verification Compiler provides best-in-class simulation and an intuitive, robust, multi-platform debug engine, with next-generation static and formal solutions, a comprehensive library of optimized VIP, and full low power verification and debug capabilities.

Verification Compiler enables concurrent verification, allowing its constituent engines to be used concurrently and independently, greatly enhancing individual and organizational productivity. The common front-end compiler, common debug interface, and common coverage database across engines increase consistency in setup and portability of code across tools, enhance visualization by providing a common interface for all flows, and allow projects to be tracked comprehensively regardless of the verification technique deployed. Verification Compiler embodies the fundamental rethinking needed to address the rising SoC challenges our industry is facing: it is the method by which organizations can effectively verify today’s SoCs.

03/14.RD.CS4139.

