Michael Sanie, Director of Verification Product Marketing at Synopsys, explains why it’s time for the next major shift in verification technology to bring about an order-of-magnitude increase in productivity

We have witnessed two major shifts in verification during the past two decades. During both, design teams have been able to rise to the challenge of verifying the most complex state-of-the-art designs by applying innovative verification technology.

The first major shift took place in the 1990s. At that time, a state-of-the-art ASIC consisted of about 5 million gates in 1µm to 0.5µm process technology. The computing industry was driving the complexity curve in its quest to produce more powerful CPUs (Figure 1).

During the 1990s, design teams were transitioning from gate-level design to hardware description languages (HDLs), which enabled them to tackle even more complex designs. The main verification method at the time was HDL simulation.

Using HDLs, design teams were able to design larger and more complex chips efficiently and to scale their design efforts. But the efficiency and scalability benefits that HDLs brought to the design process did not extend to simulation.

The gap between design and verification scalability continued to grow because HDL simulation was not able to keep up with the rise in design complexity and size. The ‘simulation productivity gap’ emerged as the most pressing challenge in verification.

A comparable shift occurred in the 2000s. Early in the decade, design complexity, by then driven mostly by networking applications, reached new heights. Design teams used more and more IP as ASICs grew increasingly complex, reaching gate counts of 10 million or more.

Verification teams looked beyond simulation and turned to more sophisticated technologies, including advanced testbenches, constrained-random stimulus and assertions, as they worked towards higher levels of verification coverage.
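To make these terms concrete, the SystemVerilog sketch below is a minimal illustration, not drawn from the article: a constrained-random stimulus class paired with a simple immediate assertion. The packet fields, constraint bounds and names are invented for the example.

class packet;
  rand bit [7:0]  length;
  rand bit [31:0] addr;

  // Constrained random: generate only legal stimulus, here
  // lengths of 4 to 64 bytes and word-aligned addresses.
  constraint legal_c {
    length inside {[4:64]};
    addr[1:0] == 2'b00;
  }
endclass

module tb;
  initial begin
    packet p = new();
    repeat (10) begin
      if (!p.randomize()) $fatal(1, "randomization failed");
      // Immediate assertion checking a property of the stimulus.
      assert (p.addr[1:0] == 2'b00)
        else $error("unaligned address: 0x%08h", p.addr);
      $display("packet: length=%0d addr=0x%08h", p.length, p.addr);
    end
  end
endmodule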

While these new verification technologies existed as point tools, verification teams had to put in significant effort to make them work together, and it became increasingly difficult to create scalable verification solutions to address their most complex designs.

As design size and complexity continued to grow, the environments needed to manage the verification of these designs once again ceased to be scalable and efficient.

To address this new ‘verification productivity gap’, Synopsys introduced its SystemVerilog and advanced testbench methodologies. In addition, native testbench technology enabled design teams to combine and integrate several verification approaches around SystemVerilog.
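As a rough illustration of what combining approaches around SystemVerilog can look like, the sketch below adds functional coverage in the same language as the stimulus, so random generation, checking and coverage live in one environment. The covergroup and its bins are invented for this example.

class packet_cov;
  bit [7:0] length;

  // Functional coverage: track which packet-length ranges the
  // random stimulus has actually exercised.
  covergroup len_cg;
    coverpoint length {
      bins small_pkt  = {[4:15]};
      bins medium_pkt = {[16:31]};
      bins large_pkt  = {[32:64]};
    }
  endgroup

  function new();
    len_cg = new();
  endfunction

  // Call once per randomized packet observed by the testbench.
  function void sample_pkt(bit [7:0] l);
    length = l;
    len_cg.sample();
  endfunction
endclass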

Several years on, we are witnessing another substantial shift in the industry’s design and verification needs. Today’s requirements are being driven by changes in the SoC design process and the continued growth in design complexity.

Sub-32 nanometer (nm), 100-million-plus-gate designs characterise today’s SoC devices. To meet project schedules, design teams are making extensive use of IP.

Today’s convergent consumer products demand SoCs which integrate applications processors along with several other functions and extensive support for software applications.

Market trends drive technology

Convergence is the key market trend for today’s devices. Convergence products, such as smart phones, tablets and other advanced consumer devices, combine several key technologies within the same device. They typically incorporate multicore CPUs and support for multiple interface protocols, often upwards of ten. They require long battery life, advanced software features and short time-to-market.

Design complexity is also increasing because of the need to achieve low power; some designs now incorporate more than 20 voltage domains.

For many consumer products, software is now the key to SoC differentiation. It should be no surprise, then, that according to IBS Research, 25 percent of the value of today’s SoCs is in the software. This is up from just 4 percent in the early 2000s.

It now takes significant investment to develop an advanced SoC. Companies are spending upwards of $100 million to produce the latest chip designs, with the majority of that spend directed at infrastructure and engineering costs.

The size of the verification effort now typically outweighs that of design by two to one. In an attempt to keep up, compute farms have doubled in size over recent years, verification teams have grown to twice the size of their design counterparts, and debug alone now accounts for 35 percent of the entire verification effort. It is no surprise that businesses are focusing on verification productivity like never before.

Verification teams do embrace the enhanced feature sets, improved simulation performance and more efficient memory use offered by verification providers. Yet incremental improvements to today’s tools will not deliver an order-of-magnitude boost in verification productivity given the complexity of today’s designs. What verification teams need instead are innovations that focus on key productivity bottlenecks.

These innovations need to deliver significant improvements in performance and capacity as well as superior, more intuitive debug that enables engineers to quickly analyse vast amounts of data and find design bugs.

Also required are comprehensive, proven verification IP that is fast, efficient and timely; innovative low-power verification solutions; and hardware-software co-verification solutions that allow software teams to develop code alongside the hardware. This will enable validation of the functionality and performance of the entire system.

Solving these difficult problems represents a significant growth opportunity, and innovative approaches will be at the heart of the next major shift in verification.

Synopsys

www.synopsys.com