New Year’s Foreboding

By Michael Feldman

January 7, 2009

As is usual for the supercomputing world in early January, news is hard to come by. With so many academics in the community, a lot of HPC practitioners are still on their extended winter breaks. As for commercial HPC companies, they may not be so eager to return to work to confront the new economic realities they’ll be facing in 2009.

With the recession in full swing, the most likely news for HPC vendors may be bad news. Layoffs, mergers and other forms of retrenchment may end up being standard operating procedure for these organizations in 2009. Even though HPC-related businesses are expected to be spared the worst of the recession, that doesn’t mean there won’t be consequences. In fact, some of the effects of the downturn are already in evidence.

At the end of 2008, Cray took a goodwill write-down of about $55.4 million because the company’s market capitalization had fallen below the value of its net assets. Even though Cray was able to book $100 million in Q4 as a result of Oak Ridge National Laboratory’s official acceptance of the new Jaguar petaflop system, the non-cash write-down meant the company still suffered a net loss for 2008.

Cray’s stock was not the only one tanking. Practically every public IT company with (or without) an HPC stake rode the steep market decline that began last September. By the end of 2008, shares of IBM, HP, Sun Microsystems, SGI, Dell, Microsoft, Mellanox, Voltaire and ANSYS reflected the general fourth-quarter sell-off, with most losing a third of their value or more.

Stocks of chipmakers Intel, AMD, NVIDIA and Xilinx followed a similar pattern. The semiconductor industry, in general, is in for a rough ride this year. According to BusinessWeek, the global semiconductor industry is going to be hit hard by reduced demand for chips worldwide, just as production capacity is peaking.

The extent of the recession’s effect on the HPC market is going to be the big question in 2009. A lot depends on how governments around the world react to the financial turmoil. Pumping money into R&D and into the industrial users of HPC would help maintain demand, but with so many sectors in trouble, even the government stimulus packages being considered in the big industrialized nations will be unable to cover everyone.

In particular, the financial services sector is likely to consolidate, regardless of bailouts or stimulus spending, although this may not translate into reduced HPC spending. According to a recent survey of IT professionals at tier-one banks, HPC investments will grow in 2009. This probably reflects the view that IT spending, in general, makes sense in practically all economic climates, since it helps increase worker productivity. In good times this translates into larger revenues; in bad times it means workers can be replaced with technology.

Other major HPC sectors should be able to weather the economic downturn fairly well, too. Flush with cash, oil and gas companies are not in any immediate trouble, although lower demand (and prices) for hydrocarbon fuels will inhibit some seismic exploration ventures, at least in the near term. Likewise, biotech companies with plenty of cash on hand should be able to navigate through the recession, although startups may starve for funds given the lack of investor capital and the tight credit market.

Where governments can help most is in getting the overall economy back on track and making some key investments. In the U.S., President-elect Obama’s plans for a government stimulus package are still being drawn up this month. More than half of the nearly $800 billion proposed will go toward tax relief, which will do little to spur computing demand. But the remainder is to be spent on programs, including infrastructure build-out and energy research and development, both of which could entail new HPC activity, albeit not immediately.

Obama and the Democratic Congress are also likely to fulfill their commitment to double federal funding of basic research over 10 years and to make the R&D tax credit for corporations permanent. Since this entails only a few billion dollars per year, there probably won’t be much opposition to this type of spending, especially when seen against a backdrop of trillion-dollar-per-year deficit spending.

The real problem is at the state level, where, according to the Center on Budget and Policy Priorities, the cumulative budget shortfall will reach $89 billion in 2009, ballooning to $145 billion in 2010 and $180 billion in 2011. If the stimulus package doesn’t cover shortfalls at the state level and the feds don’t step in with extra money, university and other state-sponsored R&D funding could face severe cutbacks. The governor of Wyoming has already drawn up a $1 billion wish list for Obama to rescue his state’s projects, including $50 million for a supercomputer at the University of Wyoming. Other states are sure to start lining up for federal funds.

HPC users are understandably wary. In a poll taken by Douglas Eadline at Linux Magazine in December, 40 percent of the 42 respondents said the economic downturn would not affect their HPC budget plans for 2009. Eadline suggests those are pretty good numbers, considering the rest of the IT industry appears headed for a more severe downturn. John West at insideHPC adds his two cents on the topic, noting that HPC use is precariously balanced on the backs of the HPC providers:

HPC vendors are experiencing economic disruptions because razor thin margins and little access to working capital mean they are hypersensitive to even small changes in the market. But they could stabilize where they are now and slowly return back to the “critical, but stable” state they’ve declined into over the past decade. However, if the disruption pushes into the HPC provider community — the labs and departments that provide HPC cycles and expertise to users — and we see a large scale reduction in employment and acquisition, then I think we’ll be in for a wholesale restructuring of the HPC market.

The open question is whether the downturn will be resolved in the next four quarters. Because of the long intervals between budget planning, procurement and deployment, it’s quite possible that any trouble exposed in 2009 will only be a prelude to deeper problems in 2010 and beyond. But if the recession is basically over by the end of this year, the hangover should be relatively mild. Otherwise, HPC growth is likely to take a much lower trajectory for the foreseeable future.
