When I started in the EDA industry, there was only one kind of verification that designers had to deal with -- functional verification. That also included some aspects of timing verification because simulators had built-in min/max timing abilities and could check a device’s setup and hold times. The market was primarily a board market because chip design was still very much in its infancy.
It was all fairly simple, and verification was far from being a bottleneck in the process.
Today, verification is becoming one of the most expensive aspects of the design process. I wonder if anyone has actually added up the total cost of verification over the entire concept-to-product lifecycle. In the functional realm, verification is estimated to account for between 50 and 70 percent of total design cost, but this does not take into account all of the other types of verification that must now be performed, including:
- Timing verification
- Physical verification
- High-speed verification of interfaces such as DDR and SerDes
- Thermal, noise, and IR-drop analysis
Even functional verification is reaching a crisis level: It's not possible to perform enough simulation to reach a reasonable level of confidence that the device will work. Part of this stems from inefficiencies in the tools themselves, which were built on the assumption that compute power was free and that human time was the only thing worth optimizing.
Many companies have spent millions of dollars on emulators and other forms of simulation acceleration to get longer verification runs or more tests executed, but even these cannot provide enough horsepower.
Testing using constrained-random test-pattern generation may be one of the largest follies in EDA history. Today, an additional layer of abstraction is all but being forced on the industry to enable more verification. Hopefully, this will keep the same mistakes from recurring.
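To make the technique concrete, here is a minimal sketch of constrained-random stimulus generation in Python. This is an illustration of the idea only, not any vendor's flow; the transaction fields, constraints, and coverage bins are all hypothetical.

```python
import random

def random_bus_transaction(rng):
    """Generate one hypothetical bus transaction under simple constraints:
    address is word-aligned and below 0x1000; burst length is 1, 2, or 4."""
    addr = rng.randrange(0, 0x1000, 4)   # constraint: aligned, in range
    burst = rng.choice([1, 2, 4])        # constraint: legal burst sizes only
    write = rng.random() < 0.5           # unconstrained read/write mix
    return {"addr": addr, "burst": burst, "write": write}

rng = random.Random(42)                  # seeded for reproducibility
txns = [random_bus_transaction(rng) for _ in range(1000)]

# Crude functional-coverage check: did we exercise every
# (burst length, direction) combination at least once?
covered = {(t["burst"], t["write"]) for t in txns}
print(f"hit {len(covered)} of 6 coverage bins")
```

The appeal, and the folly, is visible even at this scale: random draws cheaply reach corner cases no one thought to write directed tests for, but confidence rests entirely on how well the coverage model captures what actually matters.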
One piece of good news is that the race for higher clock frequencies has subsided. When single-processor designs were the rage, every product iteration demanded faster processors, which in turn drove increasingly complex designs.
Eventually, that tradeoff failed, and the industry transitioned to multi-processor solutions. This has not stopped interfaces and communications from moving to higher frequencies and data rates, but perhaps we will see verification costs level off, with some areas requiring additional attention while others see falling costs.