By Clive “Max” Maxfield
System Level Design (SLD) is one of those terms that means different things to different people. For some, SLD is understood to refer to the process of capturing (and verifying) a complex ASIC/ASSP/SoC design at a high level of abstraction. Another term that we often hear in this regard is Electronic System Level (ESL) design.
Trust me, you don’t want to get me talking about ESL. A couple of years ago I wrote an opinion piece for a DAC newsletter entitled “What the hell is ESL?” You would be amazed at the number of folks who e-mailed me after reading that piece saying, “Thank goodness, I thought it was only me who was confused.” But we digress…
I absolutely agree that SoCs are going to get bigger and better (and scarier), and that companies like Cadence, Mentor, Magma, and Synopsys – along with all of the smaller EDA companies whose mission it is to create SoC design and verification tools – are going to continue to astound us with their latest and greatest offerings.
But I also have some problems with this. For example, very few products these days can justify the tens of millions of dollars it can cost to design and deploy a custom silicon chip. Another consideration is that when you create an SoC, your algorithms are effectively “frozen in silicon,” which can be extremely awkward when standards and protocols are continually evolving all around us.
One alternative is to use a Field Programmable Gate Array (FPGA) as the implementation platform for the design. Of course we all know that early FPGAs were logic-limited, power-hungry, and generally not very interesting. But times have changed. Some FPGA families boast incredibly high capacity and high performance; others offer extremely low power; still others provide mixed-signal capabilities; and all offer configurable fabric that can be adapted to whatever tasks are required.
This doesn’t mean that FPGAs are suitable for every application, but they are currently appearing in all sorts of systems, from handheld, battery-powered units to supercomputers whose performance makes your eyes water.
Another consideration is that future generations of electronic products will not be used as standalone devices. Instead, they will be intelligent elements in an interconnected ecosystem. As one rather obvious example, look at Apple’s iPod. When considered in isolation, this is only a vaguely interesting media-playing device. What has made the iPod so widely successful, especially when compared to its competition, is its associated media purchase and download ecosystem in the form of iTunes.
This type of “big picture” view increasingly will apply to products of all shapes and sizes. Consider, for example, a company that creates residential air conditioning units. Customers are certainly going to be interested to hear about new units that are more efficient, quieter, and more powerful, but only up to a point. What will really interest them is an air conditioning system augmented with connectivity features that allow them to monitor and interact with it from anywhere in the world. Imagine, for example, returning from vacation, realizing that you have arrived in the middle of an unexpected heat wave, and calling your air conditioner from your car to instruct it to start cooling your home in preparation for your arrival.
Similarly, imagine a smart air conditioner that can communicate with its manufacturer and/or your service provider. If your air conditioner notices an unexpected vibration or a loss in performance, it could automatically log a service request before a failure occurs. (This would have been really useful at my house a couple of summers ago.)
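Just to make this concrete, the self-monitoring involved doesn’t have to be rocket science. The following is a minimal sketch in C of the kind of check a smart unit’s firmware might perform once per monitoring cycle. Every function name, threshold, and unit ID here is invented purely for illustration; a real appliance and its service protocol would, of course, be considerably more sophisticated.

    /* Hypothetical self-monitoring pass for a "smart" air conditioner.
     * All names and the vibration threshold are made up for illustration;
     * they do not correspond to any real product or API. */
    #include <stdbool.h>
    #include <stdio.h>

    #define VIBRATION_LIMIT_MM_S 4.5  /* made-up alert threshold */

    /* Stand-ins for real sensor and network calls. */
    static double read_vibration_mm_s(void)
    {
        return 5.1;  /* pretend the compressor is rattling */
    }

    static bool send_service_request(const char *unit_id, const char *reason)
    {
        printf("Service request for unit %s: %s\n", unit_id, reason);
        return true;  /* pretend the call to the provider succeeded */
    }

    int main(void)
    {
        /* One pass of the monitoring loop: flag the anomaly before
         * it turns into an outright failure. */
        if (read_vibration_mm_s() > VIBRATION_LIMIT_MM_S)
            send_service_request("AC-1234", "excess compressor vibration");
        return 0;
    }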
What all this means is that “System Level Design” cannot simply focus on the development of a silicon chip in isolation. Instead, we have to work from the top down at a very high level of abstraction, starting by considering the user “experience,” the user interface, and the way in which this product is going to interact with the outside world. And only then should we actually start to think about the underlying implementation. Put another way, it’s important to decide just what it is we actually want to do, and then we can decide how to go about doing it.
The majority of today’s EDA environments are ferociously complicated, not least because they require their users to learn special languages such as VHDL and Verilog. But a lot of folks who have really good ideas don’t know these languages and don’t think like hardware design engineers. Wouldn’t it be better to make the tools more intelligent so that they can understand the languages favored by different users, such as C/C++, Java, Python, and so forth? I know there are some interesting C/C++ synthesis tools around, but a lot more could be done in this area.
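To give a flavor of what I mean, the sketch below shows the general shape of plain C that C/C++ synthesis flows can typically digest: fixed loop bounds, static arrays, and no dynamic memory, from which a tool can infer the multipliers, adders, and shift register. This is my own illustrative example, not something lifted from any particular vendor’s documentation.

    /* A 4-tap FIR filter in plain, synthesis-friendly C. The tap values
     * are made up for illustration. */
    #include <stdio.h>

    #define TAPS 4

    int fir_filter(int sample)
    {
        static const int coeff[TAPS] = { 1, 3, 3, 1 };
        static int delay[TAPS];

        /* Shift the delay line and insert the new sample. */
        for (int i = TAPS - 1; i > 0; i--)
            delay[i] = delay[i - 1];
        delay[0] = sample;

        /* Multiply-accumulate across the taps. */
        int acc = 0;
        for (int i = 0; i < TAPS; i++)
            acc += coeff[i] * delay[i];

        return acc;
    }

    int main(void)
    {
        /* Feed an impulse through; the output traces out the taps. */
        int input[6] = { 0, 0, 1, 0, 0, 0 };
        for (int i = 0; i < 6; i++)
            printf("%d ", fir_filter(input[i]));
        printf("\n");  /* prints: 0 0 1 3 3 1 */
        return 0;
    }

The point is that the person writing this doesn’t need to know a flip-flop from a frying pan; the tool takes care of the hardware details.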
As a somewhat related topic, it’s no longer sufficient to design the various parts of a product in isolation: the FPGA, the circuit board on which it rides, the enclosure in which everything resides, and the firmware and software that run on the device. Instead, what is required is a unified environment in which everything can be developed and verified in the context of everything else.
Yes, the “big boys” in EDA have environments like this … but have you actually tried to use one? It’s hard enough to gain expertise in even a small portion of one of these environments. Running the entire thing with only a couple of people is well-nigh impossible. And then there’s the cost of all these tools.
The reason all of this is so important is that we are increasingly restricting product design to fewer and fewer people, all of whom require incredible levels of training and expertise. This may work for the most complex SoC devices, but it is not a good way to go for the majority of products. What we need are solutions that will unleash the creative and innovative potential of a wide range of users. Instead of leaving things to a handful of technological experts, we need to empower new waves of users who can conceive world-changing ideas and products.