New Year’s Foreboding

By Michael Feldman

January 7, 2009

As is usual for the supercomputing world in early January, news is hard to come by. With so many academics in the community, a lot of HPC practitioners are still on their extended winter breaks. As for commercial HPC companies, they may not be so eager to return to work to confront the new economic realities they’ll be facing in 2009.

With the recession in full swing, the most likely news for HPC vendors may be bad news. Layoffs, mergers and other forms of retrenchment may end up being standard operating procedure for these organizations in 2009. Even though HPC-related businesses are expected to be spared the worst of the recession, that doesn’t mean there won’t be consequences. In fact, some of the effects of the downturn are already in evidence.

At the end of 2008, Cray took a goodwill write-down of about $55.4 million because the company's market capitalization had fallen below the value of its net assets. Even though Cray was able to book $100 million in Q4 as a result of Oak Ridge National Lab's official acceptance of the new petaflop Jaguar system, the non-cash write-down meant that the company still suffered a net loss for 2008.

Cray's stock was not the only one tanking. Practically every public IT company with (or without) an HPC stake rode the steep market decline that began in September last year. By the end of 2008, the stocks of IBM, HP, Sun Microsystems, SGI, Dell, Microsoft, Mellanox, Voltaire and ANSYS reflected the general market sell-off of the fourth quarter, with most losing one-third of their value or more.

Stocks of chipmakers Intel, AMD, NVIDIA and Xilinx followed a similar pattern. The semiconductor industry, in general, is in for a rough ride this year. According to BusinessWeek, the global semiconductor industry is going to be hit hard by reduced demand for chips worldwide, just as production capacity is peaking.

The extent of the recession's effect on the HPC market will be the big question in 2009. A lot depends on how governments around the world react to the financial turmoil. Pumping money into R&D and into industrial users of HPC would help maintain demand, but with so many sectors in trouble, even the combined government stimulus packages being considered by the big industrialized nations won't be able to cover everyone.

In particular, the financial services sector is likely to consolidate, regardless of bailouts or stimulus spending, although this may not translate into reduced HPC spending. According to a recent survey of IT professionals at tier-one banks, HPC investments will grow in 2009. This probably reflects the view that IT spending, in general, makes sense in practically all economic climates, since it helps increase worker productivity. In good times this translates into larger revenues; in bad times it means workers can be replaced with technology.

Other major HPC sectors should be able to weather the economic downturn fairly well, too. Flush with cash, oil and gas companies are not in any immediate trouble, although lower demand (and prices) for hydrocarbon fuels will inhibit some seismic exploration ventures, at least in the near term. Likewise, biotech companies with plenty of cash on hand should be able to navigate through the recession, although startups may be starved for funds because of the scarcity of investor capital and the tight credit market.

Where governments can help most is in getting the overall economy back on track and making some key investments. In the U.S., President-elect Obama's plans for a government stimulus package are still being drawn up this month. More than half of the nearly $800 billion proposed will go towards tax relief, which will do little to spur computing demand. But the remainder is to be spent on programs, including infrastructure build-out and energy research and development, both of which could entail new HPC activity, albeit not immediately.

Obama and the Democratic Congress are also likely to fulfill their commitment to double federal funding of basic research over 10 years and to make the R&D tax credit for corporations permanent. Since this entails only a few billion dollars per year, there probably won't be a lot of opposition to this type of spending, especially when seen against a backdrop of trillion-dollar-a-year deficit spending.

The real problem is at the state level, where, according to the Center on Budget and Policy Priorities, the cumulative budget shortfall will be $89 billion in 2009, ballooning to $145 billion in 2010 and $180 billion in 2011. If the stimulus package doesn't cover shortfalls at the state level and the feds don't step in with extra money, university and other state-sponsored R&D funding could face severe cutbacks. The governor of Wyoming has already drawn up a $1 billion wish list for Obama to rescue his state's projects, including $50 million for a supercomputer at the University of Wyoming. Other states are sure to start lining up for federal funds.

HPC users are understandably wary. In a poll taken by Douglas Eadline at Linux Magazine in December, 40 percent of the 42 respondents said the economic downturn would not affect their HPC budget plans for 2009. Eadline suggests those are pretty good numbers, considering the rest of the IT industry appears to be headed for a more severe downturn. John West, at insideHPC, adds his two cents on the topic, noting that HPC use is precariously balanced on the backs of the HPC providers:

HPC vendors are experiencing economic disruptions because razor thin margins and little access to working capital mean they are hypersensitive to even small changes in the market. But they could stabilize where they are now and slowly return back to the “critical, but stable” state they’ve declined into over the past decade. However, if the disruption pushes into the HPC provider community — the labs and departments that provide HPC cycles and expertise to users — and we see a large scale reduction in employment and acquisition, then I think we’ll be in for a wholesale restructuring of the HPC market.

The open question is whether the downturn will be resolved in the next four quarters. Because of the long intervals between budget planning, procurement and deployment, it’s quite possible that any trouble exposed in 2009 will only be a prelude to deeper problems in 2010 and beyond. But if the recession is basically over by the end of this year, the hangover should be relatively mild. Otherwise, HPC growth is likely to take a much lower trajectory for the foreseeable future.
