IDC: The Changing Face of HPC

By John Russell

July 16, 2015

At IDC’s annual ISC breakfast there was a good deal more than market update numbers, although there were plenty of those: “We try to track every server sold, every quarter, worldwide,” said Earl Joseph, IDC program vice president and executive director of the HPC User Forum. Perhaps more revealing, and as important this year, was IDC’s unveiling of significant changes in how it characterizes and measures the HPC world.

For example, GPU and accelerator tracking has been promoted to a formal activity. But first the top line: The HPC market is growing after a period of sluggishness. It’s now around $10 billion (servers) and IDC expects it to grow substantially this year and steadily through 2019. Also, the collision of HPC and big data has perhaps been bigger than expected and prompted a change in IDC’s data gathering efforts.

“Lately we found that the vendors are having great difficulty determining how many systems go into HPC versus other areas so we’ve had to switch our data collection to very large scale surveys of buyers and users. Last year we exceeded 40,000 surveys and this year it will probably be on the same order of magnitude,” said Joseph.

“One thing I want to mention is about six months ago, with all the data collection we were doing and comparing things, some numbers just stopped adding up, and what we found was that the financial sector has been growing faster than we thought and has become much larger over the last two years. In 2010 it didn’t recover much but then went into a hyper-growth mode, so you will see us restate our numbers for the last two years on the financial sector [up] on the order of at least 50 percent,” the analyst continued.

For years, IDC has divvied up HPC into four categories: supercomputer; divisional; departmental; and workgroup. Those numbers are shown below.

[Chart: IDC HPC server market by competitive segment]

However, sales for the top ten systems skew cumulative data so much that IDC is changing the way it characterizes the market.

“Except for the top ten systems the whole market grew dramatically. This is something we’ve struggled with in the past couple of years. The plan now is to introduce a fifth competitive segment around the top ten. We’re not sure what we are going to call it, maybe something like leadership computers, but the market dynamics around those top 5-10 machines are totally different,” said Joseph. The report card and forecast for the broader market are shown below.

[Chart: IDC broader HPC market report card and forecast]

You can see that storage remains hot and is expected to grow faster than other segments; that is being fueled by more data collection and more data analysis. IDC has labeled the fragmented storage market as something of a Wild West, although big players are now turning their attention to that market. IDC suggests similar dynamics in the interconnect space. Middleware is also being watched carefully not least because one would expect increased tool buying to accompany any large-scale movement to upgrade software; the latter of course is seen as a growing pain point in HPC.

Also well noted was the much-discussed effect of IBM’s sale of its x86 business to Lenovo. The table below on Q1 2015 sales starkly portrays IBM’s tumble. Joseph commented: “The industry for the last decade was basically tied with two vendors: HP and IBM had 30-34 percent each. Then you had Dell at half of that. All the other vendors had a couple percent or lower. The market now is completely different. HP is the market leader; it’s on the order of a third of the market. We were expecting to see three vendors around 15 percent and the shock here is the IBM decline was higher than expected.”

Joseph was quick to add caution in reading Cray, SGI and the smaller vendors’ single-quarter numbers: “Cray’s going to have a fourth quarter that is three times their market share and so you really have to average the small vendors.” 2015’s Q1 showed ten percent growth relative to 2014’s Q1 (see IDC Says Lenovo Strengthens, IBM Stumbles in Q1 Ranking).

[Chart: IDC HPC vendor revenue, Q1 2015]

To a fair degree, ongoing market dynamics and IDC’s responses to them were as interesting as the sales numbers. Big data continues to expand and transform the HPC world; cloud adoption for HPC purposes is probably higher than you thought; nascent HPC ROI models are showing dramatic value; and vendor shuffling, perhaps not unexpected following IBM’s x86 business sale, continues. A plus for IBM is emerging traction in the market for OpenPOWER, which now has nearly 150 members; over time IBM expects to regain market share in the technical server market along with other members of OpenPOWER.

Among the top HPC trends and watch areas cited by IDC are:

  • 2014 was soft but 2015 won’t be.
  • Big data combined with HPC is creating new solutions.
  • Software issues continue to grow.
  • GPUs & accelerators extend their impact.
  • Non-x86 processors could alter the landscape.
  • China looms large(r).
  • Growing influence of the datacenter in IT food chain.
  • HPC in the cloud gaining traction.

The cloud numbers are interesting. IDC surveyed 157 HPC sites on their use of clouds; roughly 25 percent reported using clouds, and those sites reported that nearly a third (31.2 percent) of their workloads were run on clouds.

Bob Sorensen, research vice president in IDC’s High Performance Computing group, said, “I should add the caution that some of what they are uploading are what we would consider the low hanging fruit, some of the embarrassingly parallel applications that don’t require very specific architecture. So it’s still early but we expect more sophisticated applications [to be run in the cloud] as time progresses.”

Perhaps the most transformative trend explored is big data convergence with HPC. Just what that means varies widely depending on which vendor is talking, influenced in no small measure by the type of technology they have to sell.

Interestingly, IDC is changing how it examines HPDA activities and introduced four new industry/application workflows including:

  • Fraud and anomaly detection. This “horizontal” workload segment centers around identifying harmful or potentially harmful patterns and causes using graph analysis, semantic analysis, or other high performance analytics techniques.
  • Marketing. This segment covers the use of HPDA to promote products or services, typically using complex algorithms to discern potential customers’ demographics, buying preferences and habits.
  • Business intelligence. The workload segment uses HPDA to identify opportunities to advance the market position and competitiveness of businesses, by better understanding themselves, their competitors, and the evolving dynamics of the markets they participate in.
  • Other Commercial HPDA. This catchall segment includes all commercial HPDA workloads other than the three just described. Over time, IDC expects some of these workloads to become significant enough to split out, such as the use of HPDA to manage large IT infrastructures and Internet-of-Things (IoT) infrastructures.

(Note: financial, classified buyers, etc. will continue to be listed under the existing IDC segments.)

Notably, “HPDA adoption has changed over time. It’s made its way into the scientific community and basically reached the point where it stands almost shoulder to shoulder with traditional modeling and simulation,” said Sorensen. Right now there are few rules, and IDC’s advice is to ‘embrace the chaos.’ There is so far no single best or even standard HPDA solution.

Joseph added that in the last six months IDC has conducted seven HPDA surveys: “We’ve studied everything from what are the underlying applications, what are the algorithms, where are the benchmarks, how do you evaluate a good HPDA system and solution, whether a distributed database or single database is better, and what hardware architecture do vendors plan to use to address the HPDA space.”

[Diagram: IDC HPDA market overview]

Lots of questions and opportunities, in the view of IDC and virtually everyone else. In a survey of 128 HPC sites, about 23 percent reported use of an HPC system for HPDA purposes.

One intriguing topic tackled is HPC ROI measurement. IDC has been developing economic models to try to unravel that question for some time, mostly driven by a contract with DOE. That three-year effort will likely be extended another 6-9 years, said Joseph.

In simple terms, the model input is HPC investment and the output is revenue growth, profit, and job creation. The early work seems exceedingly positive. IDC reported that for every $1 invested in HPC $356 in revenue and $38 in profit were generated – impressive if accurate.
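As a rough illustration only (not IDC’s actual economic model, which is considerably more involved), the headline ratios reduce to simple multipliers on investment:

```python
# Sketch of the headline ROI multipliers IDC reported: revenue and profit
# generated per dollar of HPC investment. These constants come from the
# figures cited above; the function and its name are illustrative.

REVENUE_PER_DOLLAR = 356.0  # $356 in revenue per $1 invested (per IDC)
PROFIT_PER_DOLLAR = 38.0    # $38 in profit per $1 invested (per IDC)

def projected_returns(hpc_investment: float) -> dict:
    """Scale the reported multipliers to an investment amount in USD."""
    return {
        "revenue": hpc_investment * REVENUE_PER_DOLLAR,
        "profit": hpc_investment * PROFIT_PER_DOLLAR,
    }

if __name__ == "__main__":
    # A hypothetical $1M HPC investment under the reported ratios.
    print(projected_returns(1_000_000))
```

At those multipliers, even a modest investment implies outsized returns, which is precisely why Joseph flagged the results as “impressive if accurate.”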

Many more topics were touched upon, and IDC is making the report available to all of the session attendees. This was the first year the event was part of ISC’s official agenda. Joseph reviewed HPC User Forum activities and said the group now planned to make all six years’ worth of HPC User Forum meeting presentations available to everyone, free, from the HPC User Forum website. It would be a fascinating exercise to review HPC expectations (user and vendor) versus results.
