The Bright Side of Decline: IDC Sheds Light on HPC Server Market

By Nicole Hemsoth

April 2, 2014

As some of you have already noted, the most recent figures from IDC’s sweep of the HPC server market are in, and on the surface they don’t suggest a stellar season ahead for supercomputing. However, when put into broader context, particularly on the international scale with a few massive surprise systems added to the mix, the big picture is anything but grim.

To recap quickly, the research group found that worldwide factory revenue for the HPC technical server market dipped 7.2%, falling to $10.3 billion for the whole of 2013. That decline is measured against 2012, a record year at $11.1 billion, and the record was driven in large part by the power of a few very large deals rather than by the seismic forces of the entire market.
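
For readers who want to check the arithmetic, the two figures are consistent with each other. A minimal sketch in Python (the dollar figures are IDC’s; the rounding is mine):

    revenue_2012 = 11.1  # billions USD, IDC's record 2012 figure
    revenue_2013 = 10.3  # billions USD, IDC's 2013 figure
    decline = (revenue_2012 - revenue_2013) / revenue_2012
    print(f"Year-over-year decline: {decline:.1%}")  # prints ~7.2%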

“The high end of the market has always been lumpy,” explained Steve Conway, Research Vice President in IDC’s technical computing division. “Ever since supercomputing has existed, there’s been a lot of variability quarter to quarter and year to year. It’s heavily driven by sales of the largest systems, which then throws off the bell curve, just as it did here.” He notes that even without those few but massive sales, the high end is marching steadily forward. “If you don’t look at this market as a one-year thing, but rather as a trend line, it’s continuing to grow.”

Conway reminded us that 2012 was very strong but was powered by some anomalies, a few outsized sales in particular. The biggest was the K system in Japan, which accounted for $550 million on its own. “This, coupled with the Tianhe-2 in China, provided a sudden injection of spending that hadn’t been predicted, and very likely won’t be repeated in the near future.” In other words, the decline in these numbers was well anticipated.

When taking a look at the server market across all segments, HPC, despite the decline in overall spending last year, is doing far better than its general business computing brethren. Average selling prices for HPC systems are already high and continue to climb relative to the rest of the market, so the difference in IDC’s most recent server figures is not a matter of selling price but of the number of units sold. Even with unit counts down, HPC has grown at a relatively steady 7% CAGR over the last several years, while the general server market has remained essentially flat at around 1%.
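
As a rough illustration of why that CAGR gap matters: compound annual growth rate is the steady year-over-year rate that carries a starting value to an ending value, so even small differences compound quickly. A minimal sketch, assuming a $10 billion starting point and a five-year horizon purely for illustration (only the 1% and 7% rates come from IDC):

    def grow(start, annual_rate, years):
        """Project a value forward at a fixed compound annual rate."""
        return start * (1.0 + annual_rate) ** years

    def cagr(start, end, years):
        """Back out the compound annual growth rate between two values."""
        return (end / start) ** (1.0 / years) - 1.0

    # Illustrative only: a $10B segment growing at 7% vs. 1% over five years.
    print(f"7% CAGR: ${grow(10.0, 0.07, 5):.1f}B")   # ~$14.0B
    print(f"1% CAGR: ${grow(10.0, 0.01, 5):.1f}B")   # ~$10.5B
    print(f"Back-check: {cagr(10.0, 14.0, 5):.1%}")  # ~7.0%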

Conway says that many considerations contribute to this disparity in server market numbers, including what customers require from their systems and, of course, HPC’s outsider status with respect to trends that have cheapened commodity enterprise computing, namely virtualization and server consolidation.

Speaking of the difference between the HPC and general business computing markets, a convergence seems to be underway. We’ve been tracking how a growing number of enterprise users are looking to technical computing servers and tools, which could signal further growth in these IDC HPC numbers. But let’s reverse that for a moment: are HPC shops looking to the commodity, stripped-down (think Open Compute style) servers that are storming the enterprise?

According to Conway, these two spaces, HPC and high-end enterprise, are merging, but only in one direction. He says more commercial firms are now buying HPC servers, but on the flip side, HPC shops are not really “downshifting” to enterprise servers. “The market is getting a little complicated now because there are workloads where the stripped-down servers make sense, but on the other end, there’s a higher expectation from other users that even more should be included in clusters.” He says they’re watching the trend, but predicting where it goes is no easy task once we move out another two or three years.

Another fuzzy area is how newer movements, including Hadoop, could alter future IDC HPC server market numbers. At the core of this is the “big data” trend, a great deal of which can be classified as technical computing under IDC’s broader definition of HPC. With analytics at the center of that definition, Hadoop cannot be ignored. Interestingly, despite the noise around Hadoop in commercial circles, the risk-averse business world is less likely to adopt it for production environments. Conway said that Hadoop use in HPC outpaces its use in the enterprise, with 29% of HPC shops reporting they were already using it. While it must be modified in a number of ways (swapping out the native file system, among other efforts to boost performance), it’s striking that the most visible talk around Hadoop’s role in the world, solving big commercial problems one batch at a time, comes from the segment where adoption is slowest.

This all calls to mind the work being done at smaller and mid-sized houses, commercial and academic alike. The server market numbers were brightest for these segments, but again, this is not so much a sign of a massive market push as of other, more subtle forces. The biggest factor behind the growth is the recession (remember that?). These segments are bouncing back, while large-scale HPC investments were less affected because the wheels of those big-system deals had already been set in motion and were “too big to fail” in a sense. Orders around $50,000, on the other hand, were more easily put on ice or even cancelled during that uncertain time. They’re back now, and the numbers reflect that rebound as well as genuinely new sales.

Among the vendors that stood out in the last year were HP (32.3% share) and IBM (27.7%), with strong showings from others, including Cray, whose sales of both supercomputers and internal products boosted its revenue 23.4%. Another surprise was Dawning, which closed out the year with a whopping 73.8% revenue growth over 2012.

Conway stressed the importance of the overall trend line across multiple years for a healthy but hilly market like HPC. He expects continued growth for the accelerator and coprocessor market, continued investment worldwide in exascale initiatives (although he argues that one common theme among nations is the need to make a solid ROI argument for sustained funding), and what is already shaping up to be an exciting 2014, from the very peak of computing down to the smaller technical computing cluster levels.
