Looking for an AI refresher to beat the summer heat? In this Q&A, Hyperion Research Senior Adviser Steve Conway surveys the AI and analytics landscape in a time of intense activity and financial backing. Just last week, the National Science Foundation (NSF) announced it had expanded the National AI Research Institutes program to 40 states (and the District of Columbia) as part of a combined $220 million investment. What is all this attention and investment leading up to? What is significant right now? What’s the HPC connection? Keep reading for insights into the questions everyone’s asking.
HPCwire: How would you describe the status of AI today?
Conway: AI is at an early developmental stage and is already very useful. The mainstream AI market is heavily exploiting early AI for narrow tasks that mimic a single, isolated human ability, especially visual or auditory understanding, for everything from Siri and Alexa to reading MRIs with superhuman ability.
HPCwire: What’s the eventual goal for AI?
Conway: The goal over time is to advance toward artificial general intelligence (AGI), where AI machines are versatile experiential learners and can be trusted to make difficult decisions in real time, including life-and-death decisions in medicine and driving situations. Experts debate what it will take to get there and whether that will happen. Hyperion Research asked noted AI experts around the world about this in a recent study. The sizeable group who believe AGI will happen said, on average, it will take 87 years. There was an outlier at 150 years. But whether or not it happens, AGI is an important aspirational goal to work toward.
HPCwire: What role does HPC play in AI?
Conway: HPC is nearly indispensable at the forefront of AI research and development today, for newer, economically important use cases as well as established scientific and engineering applications. One reason why HPC is attracting more attention lately is that it is showing where the larger, mainstream AI market is likely headed in the future. The biggest gifts HPC is giving to that market are 40-plus years of experience with parallelism and the related abilities to process and move data quickly, on premises and in more highly distributed computing environments such as clouds and other hyperscale environments. The HPC community is also an important incubator for applying heterogeneous architectures to the growing number of heterogeneous workflows in the public and private sectors.
HPCwire: Reversing that question, what role does AI play in HPC?
Conway: A recent Hyperion Research study showed that nearly all HPC sites around the world are now exploiting AI to some extent. Mostly, they’re using AI to accelerate established simulation codes, for example by identifying areas of the problem space that can be safely ignored. In cases where the problem space is an extremely sparse matrix, this heuristic approach can be especially helpful. HPC-enabled AI is also used for pre- and post-processing of data.
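The pruning idea Conway describes can be sketched in a few lines. This is a minimal illustration, not any site's actual code: the "surrogate" below is just a magnitude threshold standing in for a trained model that predicts whether a region of the problem space is active enough to warrant the expensive simulation kernel.

```python
def expensive_update(value):
    """Stand-in for a costly per-cell simulation kernel (hypothetical)."""
    return value * 0.99  # placeholder decay step

def magnitude_surrogate(value, threshold):
    """Placeholder for a trained model's 'is this region active?' prediction."""
    return abs(value) > threshold

def simulate_with_pruning(grid, surrogate, threshold=1e-6):
    """Run the costly kernel only where the surrogate says the cell matters."""
    updated = []
    skipped = 0
    for value in grid:
        if surrogate(value, threshold):
            updated.append(expensive_update(value))
        else:
            updated.append(value)  # cell deemed inert; left untouched
            skipped += 1
    return updated, skipped

# A mostly-empty (sparse) problem space: only 2 of 6 cells are active.
grid = [0.0, 3.2, 0.0, 0.0, -1.5, 0.0]
result, skipped = simulate_with_pruning(grid, magnitude_surrogate)
print(skipped)  # 4 of 6 cells skipped
```

When most of the problem space is inert, as in a very sparse matrix, the savings from skipping those cells dominate the cost of querying the surrogate, which is why this heuristic pays off.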
HPCwire: What’s the relationship between analytics and simulation in HPC-enabled AI?
Conway: Some applications use analytics alone, but many HPC-enabled AI applications benefit from both data analytics and simulation methodologies. Simulation isn’t becoming less important with the rise of AI. This frequent pairing of simulation and analytics means that HPC system designs need to be both compute-friendly and data-friendly. Newer designs are starting to reverse the increasing compute-centrism of recent decades and establish a better balance.
HPCwire: You mentioned “newer, economically important use cases” for HPC-enabled AI. Can you say more about those?
Conway: A few years ago, anecdotal evidence led Hyperion Research to compile a list of recurring AI use cases that vendors could begin to pursue as emerging HPC market segments: precision medicine, automated driving systems, fraud and anomaly detection, business intelligence, affinity marketing, and IoT/smart cities/edge computing. Hyperion Research’s recently completed multi-client study of the worldwide HPC market found that 80 percent of the surveyed HPC sites already use one or more of these applications. Some of this is to support established HPC applications in HPC datacenters, but a surprising portion is to support business operations in enterprise datacenters. This confirmed the growth of a trend we’ve been tracking for a decade, where enterprise data analytics requirements are pushing up into the HPC competency space.
HPCwire: How is AI related to HPDA?
Conway: Hyperion Research defines high performance data analysis, HPDA, as data-intensive computing that uses HPC resources, whether for simulation or analytics. AI is the subset of HPDA that relies on data analytics, whether via learning models or other analytical methods.
HPCwire: What about AI and cloud computing? Edge computing?
Conway: Our studies show that 20 percent of all HPC workloads are being run in third-party clouds and this number is growing, mostly not at the expense of on-premises computing. AI methods supporting HPC workloads are about as common in cloud settings as on premises. HPC also has a crucial role to play in the important subset of edge computing applications that need wide-area analysis and control, as opposed to just local responsiveness at the edge. A large portion of the one-time Top500-leading Tianhe-1A supercomputer, for example, was dedicated to urban traffic management in Guangzhou. Some respected thinkers believe HPC will be the glue that unifies the emerging global IT infrastructure, from edge to exascale.
HPCwire: What’s needed to move things forward? Are people working on these things?
Conway: Things are definitely moving forward, thanks in no small part to researchers advancing AI practices in the worldwide HPC community, but important challenges remain under active investigation. They include making the operations of multilayered neural networks explainable and trustworthy, ramping up the availability of realistic synthetic data to address the shortage of useful real-world data in some domains, and advancing multimodal AI that can concurrently mimic more than one human sense. A more profound challenge concerns AI methodologies themselves: the decades-old and intensifying debate between experts who believe learning models will be adequate for achieving AGI, and those who argue that learning models mimic only high-level, abstract functions of intelligence and must be augmented with methods that, like our brains, learn by directly experiencing the natural world.
HPCwire: The NSF recently announced it is expanding the National AI Research Institutes program to 40 states (and the District of Columbia) as part of a $220 million investment. That is one of many state-sponsored AI projects being launched around the world. Where does public investment fit in your view of AI?
Conway: The U.S., China, Europe and Japan all have government-funded initiatives aimed at increasing their AI capabilities as a prerequisite for scientific-engineering progress and economic competitiveness. They have analogous initiatives in HPC, which has already proven its ability to accelerate scientific and industrial innovation. For AI, HPC and other technologies with strong transformational potential, government investment is crucial for laying out national-regional goals and motivating progress toward those goals, especially when the technology is in an early and uncertain stage, as is certainly true of AI.
HPCwire: Stephen Hawking famously said, “AI is likely to be either the best or worst thing to happen to humanity.” Care to comment?
Conway: Who am I to question Dr. Hawking? I think there’s a danger that AI’s now-unstoppable momentum could overwhelm crucial ethical considerations, but especially in the past two years or so, more attention is being paid to the ethical ramifications of AI progress. With AGI predicted to be a century or so in the future at best, humanity has some time to wrestle with this.
Bio: Steve Conway is Senior Adviser of HPC Market Dynamics at Hyperion Research. Conway directs research related to the worldwide market for high performance computing. He also leads Hyperion Research’s practice in high performance data analysis (big data needing HPC).