IDC: HPC Will Resume Growth After Dipping in 2009

By Nicole Hemsoth

January 29, 2009

‘Tis the season for IDC’s annual HPC market forecast, only this time around it needs to consider a global economic recession. In this exclusive interview, HPCwire quizzes Earl Joseph, IDC’s program vice president for HPC, about what’s in store for 2009.

HPCwire: IDC recently revised its forecast for the HPC market. In a nutshell, what is the new forecast?

Earl Joseph: Based on actual numbers for the first three quarters of 2008 and modeled fourth-quarter numbers, IDC estimates that full-year 2008 HPC server revenue will come in at around $9.6 billion. That’s down 4.2 percent from 2007. Our new forecast predicts HPC server revenue will dip about 5.4 percent in 2009, then start modest growth again in 2010 and rebound to 9 percent-plus growth, eventually reaching $11.7 billion in revenue by 2012.
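The figures above imply a consistent revenue path. As a quick sanity check (a sketch, not IDC's model; only the endpoints come from the interview, and the implied growth rate is our back-of-the-envelope calculation), the numbers can be worked through like this:

```python
# Sanity check of the forecast arithmetic. All dollar inputs come straight
# from the interview; the intermediate growth path is implied, not IDC's.
rev_2008 = 9.6                       # $B, estimated full-year 2008 revenue
rev_2007 = rev_2008 / (1 - 0.042)    # 2008 was down 4.2% from 2007
rev_2009 = rev_2008 * (1 - 0.054)    # forecast: ~5.4% dip in 2009
rev_2012 = 11.7                      # $B, forecast endpoint

# Average compound growth rate implied from the 2009 trough to 2012
cagr = (rev_2012 / rev_2009) ** (1 / 3) - 1

print(f"2007 ~ ${rev_2007:.1f}B, 2009 ~ ${rev_2009:.1f}B, "
      f"implied 2009-2012 CAGR ~ {cagr:.1%}")
```

This backs out roughly $10.0 billion for 2007 and $9.1 billion for 2009, with an implied average growth rate of about 8.8 percent a year from the trough to 2012, which squares with the "modest growth in 2010, then 9 percent-plus" shape of the forecast.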

HPCwire: Why did you revise the forecast?

Joseph: We normally do a new five-year HPC market forecast at this time of year. The third-quarter data on HPC servers that we received from the hardware vendors in late 2008 showed that the global economic recession was already throttling down revenue in HPC. That greatly affected our new forecast.

HPCwire: How did you come up with the revised forecast?

Joseph: Each year we go through a specific, careful process to come up with a new five-year forecast. Our HPC team — Jie Wu, Steve Conway, Richard Walsh, and I — gets together for two full days to go through this process. We looked carefully at the actual data for the prior five years, especially the most recent quarters. We analyzed IDC’s assumptions and projections for the global IT market and the whole server market. From that we first created a fourth-quarter forecast, which also yields a full-year 2008 estimate. Next we constructed a table of assumptions about the factors most likely to influence the HPC server market in 2009 and beyond. With this foundation, we created our five-year forecast for the HPC server market, the competitive price band segments of the market, and so on.

HPCwire: How confident are you in the forecast?

Joseph: We’re fairly confident about the major assumptions and the general trends in the predictions. We all wish we had a crystal ball that would tell us what is going to happen in the overall world economy. It is always harder to forecast during periods of major ups and downs, and this was one of the hardest times for creating forecasts. Our forecasts are intentionally on the conservative side, and in recent years the HPC market has consistently beaten our forecasts.

Another thing that gives us confidence is that in late 2008, when the impact of the economic downturn was already starting to be felt, we conducted an extensive worldwide, in-depth study of 110 HPC sites of all sizes and in all major sectors. These were two- to three-hour interviews that produced more data points than would fit into Excel. The topics ranged from HPC systems to processors, storage, interconnects, system software, budgets, TCO, and application workloads, and we’re churning out separate reports on these topics right now. We asked the sites not just about what they’re doing today, but also about their requirements for the next round of HPC purchasing, including the attributes that would command premium pricing. Not one of the 110 government, industrial, and academic sites planned to reduce HPC use in 2009, although we expect them to be more conservative about new spending. We received a good general sense of where budgets and spending are headed directly from the HPC buyers. That gave us some additional confidence when we put together our forecast.

HPCwire: How does HPC compare with the whole IT market?

Joseph: If our baseline assumptions about the HPC market are right, HPC will come out of the recession the same way it went in, as a bright spot in the IT sector. As I mentioned earlier, we expect HPC to start growing again in 2010 and to be on a robust growth path again by the end of our forecast period in 2012.

HPCwire: What will the main effects of the global economic recession be, where HPC is concerned?

Joseph: We expect users to become more conservative about new spending, but most existing plans will go forward, though there may be delays of a quarter or more in some cases. There will be more focus on cost-effectiveness and this will favor clusters and other standards-based solutions. Competition will heat up for new business, and some weaker vendors may close their doors while stronger ones tighten their belts. More sites will apply simulation and analysis to their existing datacenter designs to grow performance with minimal impact on power, cooling, and facility space. And with tighter controls on capital spending, some increases are expected in CAPEX-free HPC cycles delivered via service-oriented grids, or maybe cloud computing in some cases. This will become more appealing to new users and for periodic, overflow work.

HPCwire: Which HPC segments will be most affected by the global economic recession?

Joseph: Some automotive and financial services firms are so hard pressed that we see them shrinking CAPEX even in mission-critical areas, including HPC. North American automakers will generally take more drastic steps than their Japanese and European counterparts in these reductions.

In sharp contrast, HPC is deeply embedded in the R&D process of oil and gas companies, and most of these companies are in good shape financially even with lower energy prices. We don’t expect much in the way of budget cuts and we see HPC growth plans being carried out, although some purchases may be delayed.

Government and academia together make up over 65 percent of the HPC server market. They’ll probably follow historical patterns and react less quickly and less deeply to the economic downturn than the private sector does.

HPCwire: How much of the HPC market does government spending make up? What impact does IDC expect from the Obama Administration?

Joseph: The U.S. Government is the world’s largest HPC customer, and the new U.S. Administration has said it plans to boost spending on science and technology, so that’s a hopeful sign. HPC should also be critical for the alternative energy research that is one of President Obama’s top priorities. But in the U.S. and around the world, HPC will compete for funding with other urgent priorities, and not all new HPC initiatives will get funded, or funded fully, in 2009. And in the U.S., the change of Administration could delay funding for new initiatives and for the expansion of existing HPC-related science and technology programs. We expect that some weapons-based HPC work will be redirected and that some procurements may be delayed to free up funding for higher-priority projects.

HPCwire: Will IDC revisit this forecast?

Joseph: We plan to fully update the forecast once a quarter for a while, after we receive and analyze the results from vendor quarterly sales. We are hopeful that the HPC market will recover faster than we are currently projecting, but it will depend heavily on how long the economic slowdown lasts.

HPCwire: What other major developments do you expect in the HPC market in 2009?

Joseph: Overall, we expect 2009 to be another year of evolutionary change in the HPC market. Incremental advances will ease the pain of dealing with the massive increase in core counts, but they won’t cure the big issues of highly parallel programming, power and cooling costs, software licensing costs, ease of use, and so on. There will likely be a number of exciting new petascale installations in 2009.

The HPC storage market will stay stronger than the server market through the recession period. We expect that “ease-of-everything” solutions will continue to grow in the low-end workgroup segment and start spreading to higher price point systems. The research we did for the Council on Competitiveness showed that HPC use is already a metric for industrial competitiveness in tier 1 firms. In 2009, we think HPC use in these firms’ supply chains will start to become a competitiveness metric.

Standards-based clusters will gain market share in the price-sensitive economy, but more and more HPC sites will experience retrograde performance on some key codes. In a survey we did in the second half of 2008, half of the HPC sites said that within the next 12 months they expected some of their codes to run more slowly on their newest HPC system than on the previous one. That’s a disturbing new trend that’s being driven by escalating core counts and the inability to move data in and out of each core fast enough to keep the cores busy. It’s exacerbated by energy-saving, tuned-down processor speeds that reduce single-threaded performance. Most HPC sites would rather see faster clock rates.
