IDC: The Changing Face of HPC

By John Russell

July 16, 2015

At IDC’s annual ISC breakfast there was a good deal more than market update numbers, although there were plenty of those: “We try to track every server sold, every quarter, worldwide,” said Earl Joseph, IDC program vice president and executive director of the HPC User Forum. Perhaps as revealing, and as important, was IDC’s unveiling this year of significant changes in how it characterizes and measures the HPC world.

For example, GPU and accelerator tracking has been promoted to a formal activity. But first the top line: The HPC market is growing after a period of sluggishness. It’s now around $10 billion (servers) and IDC expects it to grow substantially this year and steadily through 2019. Also, the collision of HPC and big data has perhaps been bigger than expected and prompted a change in IDC’s data gathering efforts.

“Lately we found that the vendors are having great difficulty determining how many systems go into HPC versus other areas so we’ve had to switch our data collection to very large scale surveys of buyers and users. Last year we exceeded 40,000 surveys and this year it will probably be on the same order of magnitude,” said Joseph.

“One thing I want to mention is about six months ago, with all the data collection we were doing and comparing things, some numbers just stopped adding up, and what we found was that the financial sector has been growing faster than we thought and has become much larger over the last two years. In 2010 it didn’t recover much but then went into a hyper growth mode, so you will see us restate our numbers for the last two years on the financial sector [up] on the order of at least 50 percent,” the analyst continued.

For years, IDC has divvied up HPC into four categories: supercomputer; divisional; departmental; and workgroup. Those numbers are shown below.

[Chart: IDC ISC server market by competitive segment]

However, sales for the top ten systems skew cumulative data so much that IDC is changing the way it characterizes the market.

“Except for the top ten systems the whole market grew dramatically. This is something we’ve struggled with in the past couple of years. The plan now is to introduce a fifth competitive segment around the top ten. We’re not sure what we are going to call it, maybe something like leadership computers, but the market dynamics around those top 5-10 machines are totally different,” said Joseph. The report card and forecast for the broader market are shown below.

[Chart: IDC ISC broader market report card and forecast]

You can see that storage remains hot and is expected to grow faster than other segments; that is being fueled by more data collection and more data analysis. IDC has labeled the fragmented storage market as something of a Wild West, although big players are now turning their attention to that market. IDC suggests similar dynamics in the interconnect space. Middleware is also being watched carefully not least because one would expect increased tool buying to accompany any large-scale movement to upgrade software; the latter of course is seen as a growing pain point in HPC.

Also well noted was the much-discussed effect of IBM’s sale of its x86 business to Lenovo. The table below on Q1 2015 sales starkly portrays IBM’s tumble. Joseph commented: “The industry for the last decade basically had two vendors tied at the top: HP and IBM had 30-34 percent each. Then you had Dell at half of that. All the other vendors had a couple percent or lower. The market now is completely different. HP is the market leader; it’s on the order of a third of the market. We were expecting to see three vendors around 15 percent, and the shock here is the IBM decline was higher than expected.”

Joseph was quick to add caution in reading Cray, SGI and the smaller vendors’ one-quarter numbers: “Cray’s going to have a fourth quarter that is three times their market share, and so you really have to average the small vendors.” 2015’s Q1 showed ten percent growth relative to 2014’s Q1 (see “IDC Says Lenovo Strengthens, IBM Stumbles in Q1 Ranking”).

[Chart: IDC ISC vendor revenue]

To a fair degree, ongoing market dynamics and IDC’s responses to them were as interesting as the sales numbers. Big data continues to expand and transform the HPC world; cloud adoption for HPC purposes is probably higher than you thought; nascent HPC ROI models are showing dramatic value; and vendor shuffling, perhaps not unexpected following IBM’s x86 business sale, continues. A plus for IBM is emerging traction in the market for OpenPOWER, which now has nearly 150 members; over time IBM expects to regain market share in the technical server market along with other members of OpenPOWER.

Among the top HPC trends and watch areas cited by IDC are:

  • 2014 was soft but 2015 won’t be.
  • Big data combined with HPC is creating new solutions.
  • Software issues continue to grow.
  • GPUs & accelerators extend their impact.
  • Non-x86 processors could alter the landscape.
  • China looms large(r).
  • Growing influence of the datacenter in IT food chain.
  • HPC in the cloud gaining traction.

The cloud numbers are interesting. IDC surveyed 157 HPC sites on their use of clouds; roughly 25 percent reported using clouds, and those users said nearly a third (31.2 percent) of their workloads ran in the cloud.

Bob Sorensen, research vice president in IDC’s High Performance Computing group, said, “I should add the caution that some of what they are uploading are what we would consider the low hanging fruit, some of the embarrassingly parallel applications that don’t require very specific architecture. So it’s still early but we expect more sophisticated applications [to be run in the cloud] as time progresses.”

Perhaps the most transformative trend explored is big data’s convergence with HPC. Just what that means varies widely depending on which vendor is talking, influenced in no small measure by the type of technology they have to sell.

Interestingly, IDC is changing how it examines HPDA activities and introduced four new industry/application workflows including:

  • Fraud and anomaly detection. This “horizontal” workload segment centers around identifying harmful or potentially harmful patterns and causes using graph analysis, semantic analysis, or other high performance analytics techniques.
  • Marketing. This segment covers the use of HPDA to promote products or services, typically using complex algorithms to discern potential customers’ demographics, buying preferences and habits.
  • Business intelligence. The workload segment uses HPDA to identify opportunities to advance the market position and competitiveness of businesses, by better understanding themselves, their competitors, and the evolving dynamics of the markets they participate in.
  • Other Commercial HPDA. This catchall segment includes all commercial HPDA workloads other than the three just described. Over time, IDC expects some of these workloads to become significant enough to split out, e.g., the use of HPDA to manage large IT infrastructures and Internet-of-Things (IoT) infrastructures.

(Note: financial, classified buyers, etc., will continue to be listed under the existing IDC segments.)

Notably, “HPDA adoption has changed over time. It’s made its way into the scientific community and basically reached the point where it stands almost shoulder to shoulder with traditional modeling and simulation,” said Sorensen. Right now there are few rules, and IDC’s advice is to ‘embrace the chaos.’ There is so far no single best or even standard HPDA solution.

Joseph added that in the last six months IDC has conducted seven HPDA surveys: “We’ve studied everything from what are the underlying applications, what are the algorithms, where are the benchmarks, how do you evaluate a good HPDA system and solution, whether a distributed database or single database is better, and what hardware architecture do vendors plan to use to address the HPDA space.”

[Diagram: IDC HPDA overview]

Lots of questions and opportunities, say IDC and virtually everyone else. In a survey of 128 HPC sites, about 23 percent reported using an HPC system for HPDA purposes.

One intriguing topic tackled is HPC ROI measurement. IDC has been developing economic models to try to unravel that question for some time, mostly driven by a contract with DOE. That three-year effort will likely be extended another 6-9 years, said Joseph.

In simple terms, the model input is HPC investment and the outputs are revenue growth, profit, and job creation. The early work seems exceedingly positive. IDC reported that for every $1 invested in HPC, $356 in revenue and $38 in profit were generated – impressive if accurate.
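Taken at face value, the reported ratios imply a simple linear projection. The sketch below is purely illustrative (the function name and the linear form are this writer’s shorthand, not IDC’s actual economic model, which is considerably more involved):

```python
def hpc_roi(investment_dollars: float) -> dict:
    """Toy linear projection of the ratios IDC reported:
    every $1 of HPC investment associated with $356 in revenue
    and $38 in profit. Illustrative only -- not IDC's model.
    """
    REVENUE_PER_DOLLAR = 356.0
    PROFIT_PER_DOLLAR = 38.0
    return {
        "revenue": investment_dollars * REVENUE_PER_DOLLAR,
        "profit": investment_dollars * PROFIT_PER_DOLLAR,
    }

# Under these ratios, a hypothetical $1M HPC investment projects to
# $356M in revenue and $38M in profit.
projection = hpc_roi(1_000_000)
```

The point of writing it out is how stark the multiplier is: any plausible linear fit with coefficients this large will make HPC spending look extraordinarily productive, which is why the “if accurate” caveat matters.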

Many more topics were touched upon, and IDC is making the report available to all of the session attendees. This was the first year the event was part of ISC’s official agenda. Joseph reviewed HPC User Forum activities and said the group now planned to make all six years’ worth of HPC User Forum meeting presentations available to everyone, free, from the HPC User Forum website. It would be a fascinating exercise to review HPC expectations (user and vendor) versus results.
