Smarter EDA: Leveraging New Technologies for Product Verification

By Gabor Samu, IBM Spectrum Computing

May 21, 2019

There is perhaps no sector more competitive than the modern electronics industry. Macro-trends, including artificial intelligence, 5G, and the internet of things (IoT), continue to propel dramatic growth. According to IoT Analytics, the number of network-connected devices is expected to exceed 34 billion by 2025, with the global IoT market topping US $1.5 trillion in revenue [1]. What most of these high-growth applications have in common is complexity: many involve system-on-a-chip (SoC) designs with tens or even hundreds of millions of gates.

New challenges for device verification

Device verification is about making sure that designs behave correctly and reliably. When engineers change a design, they simulate the new behavior in software to make sure that existing functionality does not “regress.” Taping out a design to create a photomask can cost millions of dollars, so it’s critical that designs be error-free before they are sent to a foundry. These regression tests and other verification activities can account for roughly 80% of the workload in EDA environments.

Verification engineers run multiple types of simulations, including gate-level simulations (GLS), register-transfer level (RTL) simulations, and various analog simulations, to maximize coverage and ensure that designs are error-free. Full regression runs can take days even on large-scale HPC environments. The challenge for verification engineers is that device complexity is growing faster than their ability to simulate devices in software.

Discover how to empower the EDA designer at the DAC conference starting June 2nd in Las Vegas.

Moore’s Law hits the wall

Historically, Moore’s Law helped engineers at least partially close this gap between verification requirements and capacity. In recent years, however, advances in processor performance have slowed as clock speeds run up against hard physical limits on power and heat. While EDA managers could once count on a performance boost with each new generation of hardware, the latest generation of processors has barely moved the performance needle for many single-threaded tools.

Processors continue to get more powerful, but the improvements now come mostly from multi-core parallelism and multi-threading. Unfortunately, many EDA tools are single-threaded and not easily parallelized.

Addressing the verification performance deficit

Finding ways to improve verification performance is an “all of the above” challenge. We can no longer rely on faster processors to bridge the gap, but advances continue on multiple fronts. These include reusing IP that requires less rigorous verification, new multi-threaded simulators, and more sophisticated tools for managing verification coverage. IBM Spectrum Computing software has also played a major role in sustaining verification throughput despite stalled clock speeds. For example:

  • Advances in IBM Spectrum LSF “short-job” scheduling performance help get verification jobs on and off the server farm much more quickly (a minimal job-submission sketch follows this list).
  • Other IBM Spectrum LSF improvements, such as job start-time prediction, make verification engineers more productive by giving them better visibility into when their simulations will run on busy clusters.
  • IBM Spectrum LSF License Scheduler helps ensure that scarce EDA licenses are deployed optimally to minimize feature checkout time and improve regression throughput.
  • IBM Spectrum Scale helps organizations avoid the filer “hot spots” that traditionally plague NFS-based EDA environments, boosting verification performance and facilitating data sharing between sites.
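
To make this concrete, below is a minimal sketch of how a regression might be submitted to an LSF cluster as a job array with a license-aware resource request. The queue name (short), the license token (sim_lic), and the test script are illustrative assumptions, not a documented configuration.

```python
# Minimal sketch: submit a 1,000-way regression as an LSF job array.
# Assumptions (hypothetical): a "short" queue exists, License Scheduler
# exposes a "sim_lic" license token, and run_test.sh maps the array
# index ($LSB_JOBINDEX) to an individual test.
import subprocess

subprocess.run(
    ["bsub",
     "-J", "regress[1-1000]",        # job array: one element per test
     "-q", "short",                  # hypothetical short-job queue
     "-R", "rusage[sim_lic=1]",      # check out one simulator license
     "-o", "logs/regress.%I.out",    # per-element log file (%I = index)
     "./run_test.sh"],
    check=True,
)
```

Once submitted, standard LSF commands such as bjobs can monitor the array, and start-time estimates give engineers a view of when queued elements are likely to run.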

Read also: Cognitive Computing Goes to Work

AI and quantum computing provide new opportunities

While the advances above all provide incremental improvements, solving the verification performance challenge is going to require serious breakthroughs. One promising approach is to augment EDA tools with machine learning. For example, when a late-stage bug is found, rather than spending days running regression tests, many of which may be unnecessary, a deep learning model may be able to predict which of the millions of available tests are actually relevant to validating the fix. Such innovations can dramatically reduce regression runtime and avoid schedule risk.
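
As a sketch of the idea only, and not IBM’s implementation, the snippet below trains a simple classifier on synthetic stand-ins for historical regression records, then uses its scores to decide which tests to run first. Every feature, shape, and field name here is an assumption.

```python
# Sketch of learned test selection on synthetic data. The random arrays
# stand in for features mined from real regression history (e.g. which
# RTL modules a change touched, diff size, test configuration).
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(0)
X_history = rng.random((5000, 16))      # past (change, test) feature vectors
y_failed = rng.integers(0, 2, 5000)     # 1 if that test caught a bug

model = GradientBoostingClassifier().fit(X_history, y_failed)

# Score every candidate test against the new fix and run the riskiest
# tests first instead of the full multi-day regression suite.
X_candidates = rng.random((1000, 16))
risk = model.predict_proba(X_candidates)[:, 1]
priority = np.argsort(risk)[::-1]       # highest failure risk first
print("Run these tests first:", priority[:10])
```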

Of course, achieving this requires collecting and storing enormous datasets generated by multiple tools across millions of simulation runs. This is where big data and analytics come in.
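
One plausible starting point (an assumption on our part, not a prescribed workflow) is simply to log each simulation run as a structured record in a columnar format that later analytics and training jobs can consume:

```python
# Sketch: persist per-run regression results for later analysis.
# The schema and field names are illustrative assumptions.
# (pandas.to_parquet requires pyarrow or fastparquet to be installed.)
import pandas as pd

records = pd.DataFrame({
    "test_id":   ["uart_smoke_001", "pcie_gls_417"],
    "tool":      ["rtl_sim", "gate_sim"],
    "seed":      [42, 7],
    "runtime_s": [812.4, 14532.9],
    "passed":    [True, False],
})
records.to_parquet("regression_runs.parquet", index=False)
```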

While it may seem that there is no holistic way to deal with all of this complex data, this is exactly the type of problem that machine learning, and deep learning in particular, is especially good at solving. For many organizations, it is more practical and cost-efficient to use cloud-based services to augment the EDA simulation environment than to run complex analytic applications on-premises.

Another promising approach is quantum computing. While still early in its development, quantum computing is now available on real hardware via the cloud through IBM Q. This radically new model of computation shows promise for particular problems in EDA for which classical computing is ill-suited.
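
For readers curious what this programming model looks like, here is a toy example using Qiskit, IBM’s open-source quantum SDK. It builds a two-qubit entangled (Bell) state and is purely illustrative, not an EDA workload.

```python
# Toy Qiskit example: construct a Bell (entangled) state.
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

qc = QuantumCircuit(2)
qc.h(0)       # put qubit 0 into superposition
qc.cx(0, 1)   # entangle qubit 1 with qubit 0

# Inspect the resulting state locally; the same circuit could also be
# submitted to real IBM Q hardware through the cloud.
print(Statevector.from_instruction(qc))
```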

Read also: Improving chip yield rates with cognitive computing

Learn more at DAC

This year at the annual Design Automation Conference (DAC), starting June 2nd in Las Vegas, Leon Stok, Vice President of Electronic Design Automation for IBM Systems, will share IBM’s experiences using machine learning, cloud computing, and other techniques, including quantum computing, to improve the efficiency of large-scale chip design.

You may also be interested in the keynote address by John Cohn, IBM Fellow at the MIT-IBM Watson AI Lab, “Prioritizing Play in an Automated Age,” on Wednesday, June 5th at 8:45 AM.

Also, drop by the IBM booth (#1220) in Design Infrastructure Alley to learn about valuable tools for EDA, including IBM Spectrum LSF, IBM Spectrum Scale, IBM PowerAI Vision, and IBM Watson Machine Learning Accelerator. We hope to see you there. Following the conference, we’ll share highlights and presentations with the IBM Spectrum LSF User Community.


References:

[1] https://iot-analytics.com/state-of-the-iot-update-q1-q2-2018-number-of-iot-devices-now-7b/
