Next Steps In Verification IP

By Ann Steffora Mutschler

With the cost of failure astronomically high, the last thing chip designers want to worry about is the physical IP they will use to build their SoC.

Customers are less willing to take risks, and complexity and economics have driven the need for more off-the-shelf IP, with a corresponding rise in interest in verification IP. Compounding matters, IP investments are being stretched out over longer periods than in the past, which has made verification IP even more popular. Confidence in IP is critical, and it comes from a comprehensive IP validation discipline on the part of the IP provider.

However, as any method or tool matures it draws new scrutiny, and so far the design industry has not even settled on what the optimal methodology for IP verification should be.

As a starting point, it helps to define types of verification. First, there is intensive unit-level verification, which focuses on compliance with the relevant protocols and details the functionality of the block itself, said Mark Gogolewski, CTO at Denali Software. Then there is a separate step during which the subsystem or system is constructed, and verification at that point is very different.

“For a time, there was a lot of IP verification when you had a bigger subsystem, but these days, the IP gets completely wrung out at the unit level and then when you construct the system, you are focused much more on connectivity and dataflow and how the system interacts,” he explained.

“If you are testing an IP block, there are two major dimensions of verification challenge. One is the protocols that are relevant to the IP block, and the other is the functionality, which is making sure the microarchitecture that was used to design the IP was correctly implemented,” Gogolewski said. “Correct” can have many meanings, he noted, spanning both the function and the performance objectives of that particular block of IP.
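
A toy illustration of that first dimension, protocol compliance: the sketch below, in Python rather than a hardware verification language, checks two invented rules of a generic valid/ready handshake against a recorded trace. The rules and names are assumptions for illustration, not any specific standard's.

```python
from dataclasses import dataclass

@dataclass
class Cycle:
    valid: bool
    ready: bool
    data: int

class HandshakeMonitor:
    """Checks two common handshake rules cycle by cycle:
    1. once valid is asserted it must hold until ready is seen;
    2. data must be stable while the transfer is stalled."""

    def __init__(self):
        self.pending = None   # data waiting for ready, or None
        self.errors = []

    def sample(self, t, cyc):
        if self.pending is not None:
            if not cyc.valid:
                self.errors.append(f"t={t}: valid dropped before ready")
            elif cyc.data != self.pending:
                self.errors.append(f"t={t}: data changed while stalled")
        # Track whether a transfer is still waiting for ready next cycle.
        self.pending = cyc.data if (cyc.valid and not cyc.ready) else None

# Feed the monitor a trace with a deliberate violation at t=1.
mon = HandshakeMonitor()
trace = [Cycle(True, False, 7), Cycle(True, False, 9), Cycle(True, True, 9)]
for t, cyc in enumerate(trace):
    mon.sample(t, cyc)
print(mon.errors)   # ['t=1: data changed while stalled']
```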

“Verification is all about observability and control. You need to make sure you are observing every aspect of the protocol, but then you have to give the customer control. One dimension for memories is giving easy control of the data space, and another is error injection, and that’s another level of investment that has to be made,” he added.
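
As a rough illustration of those two control dimensions, the following sketch assumes a toy memory model, not any vendor's VIP API, with backdoor control of the data space and a hook for injecting errors on reads:

```python
class MemoryModel:
    def __init__(self, depth):
        self.depth = depth
        self.mem = {}
        self.error_hook = None  # optional fault-injection callback

    # Backdoor control of the data space: fill patterns without bus traffic.
    def backdoor_fill(self, pattern):
        for addr in range(self.depth):
            self.mem[addr] = pattern(addr)

    def write(self, addr, data):
        self.mem[addr] = data

    def read(self, addr):
        data = self.mem.get(addr, 0)
        if self.error_hook:
            data = self.error_hook(addr, data)  # corrupt data on the way out
        return data

# Inject a single-bit flip on one address, e.g. to check a DUT's ECC response.
mem = MemoryModel(depth=1024)
mem.backdoor_fill(lambda addr: addr & 0xFF)
mem.error_hook = lambda addr, d: d ^ 0x01 if addr == 42 else d
assert mem.read(41) == (41 & 0xFF)
assert mem.read(42) == (42 & 0xFF) ^ 0x01
```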

Carl Ruggiero, president and CEO of Trilinear Technologies, agrees that common definitions of IP verification need to be established. “Depending on [a customer’s] point of view, everyone has a different idea of what verification ought to be, and that’s really making our job very challenging. Everybody says they want verification, but right now there is really no defined vocabulary for it. You cannot call it gates and flops like you can on the design side. People want to talk about coverage and percent of coverage, but at the same time coverage is very subjective. You can get 100% coverage with five coverage points. Therefore, it is hard to say what good coverage is because if you have 300 coverage points, you might be missing that 301st, which is the critical one. How do we go about putting metrics on it? How do we define the vocabulary so we can all speak the same language? We struggle with this on a daily basis.”
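
The subjectivity Ruggiero describes is easy to demonstrate: the same stimulus can score 100% against a five-point coverage model and far less against a richer one, because the denominator is whatever the verification team chose to list. The coverage points below are invented purely for illustration.

```python
def percent_covered(model, hits):
    """Percent of the coverage model's points seen in the hit set."""
    covered = sum(1 for point in model if point in hits)
    return 100.0 * covered / len(model)

# What the test run actually exercised.
hits = {"read", "write", "reset", "read_after_write", "full_flag"}

coarse_model = ["read", "write", "reset", "read_after_write", "full_flag"]
rich_model = coarse_model + [
    "write_when_full", "read_when_empty", "back_to_back_reset",
    # ...the 301st point, the one that matters, may not even be listed
]

print(percent_covered(coarse_model, hits))  # 100.0
print(percent_covered(rich_model, hits))    # 62.5
```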

In an effort to start out clearly with customers, Ruggiero says Trilinear talks about its verification in terms of functional coverage. “We talk about the actual things that we set out to do. We talk about garnering 100% functional coverage. While we don’t say that we’ve tested every combination ad nauseam, the functions exercised by our software drivers and reference drivers, the ones listed in the data sheet and in the specification, are the ones we’ve tested to.”
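
A minimal sketch of functional coverage in that sense, with hypothetical function and test names: every function in the data sheet maps to at least one test, and anything unmapped is flagged.

```python
# Functions the data sheet commits to (hypothetical names).
datasheet_functions = {"dma_transfer", "interrupt_on_done", "low_power_mode"}

# Which datasheet functions each test exercises.
tests_run = {
    "test_dma_basic": {"dma_transfer"},
    "test_irq": {"interrupt_on_done"},
}

exercised = set().union(*tests_run.values())
untested = datasheet_functions - exercised
print(f"functional coverage: {len(exercised)}/{len(datasheet_functions)}")
print(f"untested: {sorted(untested)}")  # ['low_power_mode']
```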

IP Verification Challenges

Ken Brock, director of physical IP marketing at Virage Logic, said that for on-chip physical IP the validation challenges include: taking a standard cell library of more than 1,600 unique circuits, running them through one of several EDA vendors’ synthesis tools, running them again through the same or different vendors’ physical synthesis and place-and-route tools, and having them all work perfectly; taking a memory compiler with a dozen different knobs and switches and producing fully functional memory IP across the range of words and bits, with multiple aspect ratios, test options and power-optimization configurations; mixing them together with other IP on an SoC; and doing all of these things over the full speed, voltage, temperature and process-variability extremes of a specific leading-edge silicon process.
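
The combinatorics behind that discipline are easy to underestimate. The sketch below enumerates an assumed, purely illustrative set of memory-compiler knobs crossed with process/voltage/temperature corners; real knob names and corner values would come from the compiler and the process design kit.

```python
from itertools import product

# Hypothetical memory-compiler knobs (illustrative, not a real compiler's).
knobs = {
    "words": [256, 1024, 4096],
    "bits": [8, 32, 64],
    "aspect_ratio": [1, 2, 4],
    "test_option": ["none", "bist"],
    "power_mode": ["standard", "low_leakage"],
}

# Hypothetical PVT corners.
corners = {
    "process": ["ss", "tt", "ff"],
    "voltage_v": [0.9, 1.0, 1.1],
    "temp_c": [-40, 25, 125],
}

configs = list(product(*knobs.values()))
pvt = list(product(*corners.values()))
print(f"{len(configs)} compiler configs x {len(pvt)} corners "
      f"= {len(configs) * len(pvt)} validation runs")
# 108 compiler configs x 27 corners = 2916 validation runs,
# before multiplying by the EDA-flow variants mentioned above.
```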

He noted that the IP validation process requires a rigorous discipline, which includes unit validation, integration testing, platform validation and silicon validation.

Indeed, IP giant ARM is pursuing just that. Tom Lantzsch, VP of ARM’s Physical IP Division, noted: “We spend a lot more time with the EDA partners integrating our IP into their flows much earlier and having them leverage it and test it themselves. It is a constant activity because when we do our verification, unlike an internal supplier, which probably has a limited EDA flow and maybe even a limited customer set within its company, we have to be much more systematic and have to create a verification environment that supports us for multiple years.”

The Cost of Providing Verified IP

Whether an IP provider invests in a new technology for entrepreneurial reasons or at the urging of major customers, as Denali did with its entry into the PCI Express arena, making that investment pay off is no small task for both the customer and the IP provider.

As Gogolewski explained, with the company’s entry into PCI Express, “the world got a lot more complicated because it is extremely configurable, programmable and complicated. What we mean by configurable is that before you even put a design in silicon there are many choices. We have a couple hundred choices in our configuration spec just to correctly specify what that particular device even looks like at a specification level. It is programmable because it has all sorts of register settings that have to be set correctly and which can change the behavior of the device. And then it’s just complicated—our engineers had to become experts on two to three thousand pages of documentation. We had to make sure all the functionality was in there with the flexibility and programmability; we had to make sure all of those thousands of pages of spec became error checks and assertions. And then the way that [PCI Express] protocol works, your IP has to both handle correct functionality and incorrect functionality and respond properly. So there was a multitude of error injection that we had to make available to our customers as well as our own design team to make sure that they could inject all these levels of errors and validate whether or not their design caught it correctly.”
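
To make the configurable/programmable/complicated distinction concrete, here is a rough sketch with hypothetical field and fault names, not Denali's actual Purespec interface: a configuration spec that is validated before simulation ever starts, plus an error-injection queue the testbench can drive.

```python
# Hypothetical configuration choices fixed before the design exists in
# silicon; a real PCI Express spec runs to a couple hundred such fields.
ALLOWED = {
    "link_width": {1, 2, 4, 8, 16},
    "max_payload_bytes": {128, 256, 512, 1024, 2048, 4096},
    "device_type": {"endpoint", "root_port", "switch"},
}

def validate_config(cfg):
    """Reject illegal choices before any simulation is run."""
    for field, value in cfg.items():
        if value not in ALLOWED.get(field, ()):
            raise ValueError(f"{field}={value!r} is not a legal choice")

class ErrorInjector:
    """Queues protocol-level faults for the model to apply to outgoing
    traffic, so the DUT's error handling can be exercised."""
    FAULTS = {"bad_lcrc", "nak_replay", "malformed_tlp", "drop_dllp"}

    def __init__(self):
        self.queue = []

    def inject(self, fault, count=1):
        if fault not in self.FAULTS:
            raise ValueError(f"unknown fault {fault!r}")
        self.queue.extend([fault] * count)

validate_config({"link_width": 4, "device_type": "endpoint"})
inj = ErrorInjector()
inj.inject("bad_lcrc", count=3)  # DUT should NAK and trigger a replay
```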

To deliver this level of backup data to customers for PCI Express, Denali estimates the extra engineering effort at approximately 70 to 75 man-years over seven years, with about 550,000 lines of new code created, not including the company’s Purespec library code.

The IP Verification Horizon

In the next phase of IP verification, one thing is for sure—there will be more of it provided by third parties.

“We’re at a tipping point from ‘make unless you have to buy’ to ‘buy unless you have to make,’ and the current economic climate is going to accelerate that. Basically the fundamental premise of third party IP is that if it is a ubiquitous problem and it is solved well, then the market is overall more efficient and better off when a third party solves it, rather than each customer solving it on its own,” Gogolewski said.

He also sees more IP verification moving to third-party IP vendors, even though there will always be customers that create their own IP to stay at the bleeding edge of design. And he believes coverage-centric verification will be embraced. “It used to be something that leading-edge design teams would use, but now it is becoming ubiquitous,” he said.

