HPC Carries Server Market in Third Quarter

By Michael Feldman

December 7, 2007

IDC painted a rosy picture for the HPC server market in the third quarter of 2007. According to the analyst firm, revenue for the HPC market grew 8.8 percent from Q2 and 18 percent compared to the same period last year. Total HPC server revenue in Q3 worked out to $3.0 billion, an astounding 22.9 percent of the $13.1 billion in worldwide server revenue for the quarter. The worldwide figure, by contrast, rose just 0.5 percent from Q2 and 6.3 percent from Q3 2006.
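
A quick back-of-the-envelope check of that share figure, using only the rounded numbers quoted above (purely illustrative):

    # HPC share of the worldwide server market, Q3 2007 (rounded figures)
    hpc_q3 = 3.0       # HPC server revenue, $ billions
    world_q3 = 13.1    # worldwide server revenue, $ billions
    share = 100 * hpc_q3 / world_q3
    print(round(share, 1))   # -> 22.9 (percent)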

Another way to look at it is that non-HPC server revenue declined slightly in the third quarter and only managed anemic year-over-year growth. Without HPC to buoy it up, enterprise server sales are flagging. It’s pretty clear that virtualization is exerting downward pressure on the overall server market, allowing users to get more application mileage out of less hardware. But at this point, high performance computing has resisted the effects of virtualization. The typical HPC application demands more compute cycles than a single server can provide, so sharing that server would defeat the purpose. Today, HPC application virtualization only makes sense over an entire server grid, but that model doesn’t explicitly constrain server purchases.
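
To see where that conclusion comes from, the non-HPC numbers can be backed out from the growth rates above. A rough sketch (the rounded inputs leave some slack in the results):

    # Back out non-HPC server revenue from the quoted growth rates (all in $ billions)
    hpc_q3, world_q3 = 3.0, 13.1
    hpc_q2 = hpc_q3 / 1.088          # HPC grew 8.8% from Q2
    world_q2 = world_q3 / 1.005      # worldwide grew 0.5% from Q2
    hpc_q3_2006 = hpc_q3 / 1.18      # HPC grew 18% year over year
    world_q3_2006 = world_q3 / 1.063 # worldwide grew 6.3% year over year

    print(world_q3 - hpc_q3)             # ~10.1: non-HPC revenue, Q3 2007
    print(world_q2 - hpc_q2)             # ~10.3: non-HPC revenue, Q2 2007 -> slight sequential decline
    print(world_q3_2006 - hpc_q3_2006)   # ~9.8:  non-HPC revenue, Q3 2006 -> ~3% year-over-year growth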

Embedded in the surging HPC revenue are processor sales. In 2006, HPC systems accounted for 26 percent of all processors sold in the server market. Intel’s latest push to position its new “Harpertown” and “Wolfdale” Penryn server chips as HPC processors and AMD’s struggles to feed the demand for its new quad-core Opterons are two indications of how important the HPC space has become to the chipmakers. While both Intel and AMD are busily adding virtualization support to their hardware, at some level they probably wish virtualization would just go away, since it reduces demand for their server chips. In HPC, though, they see the opportunity to feed a beast with an insatiable appetite for compute cycles.

IDC attributes much of the HPC revenue growth to lower entry prices. The fastest-growing segment is at the low end: systems priced under $50,000. With more powerful processors available, workgroups, and even individual engineers and researchers, can purchase HPC systems for as little as $10,000. IDC projects the sub-$50,000 segment to grow at an 11.4 percent CAGR through 2011.
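
For anyone unfamiliar with the metric, a compound annual growth rate simply compounds the yearly growth. A minimal sketch (the starting revenue below is a made-up placeholder, not an IDC figure):

    # Project a segment forward at an 11.4% CAGR; the base value is hypothetical
    cagr = 0.114
    revenue = 1.0   # placeholder 2006 revenue for the sub-$50,000 segment, $ billions
    for year in range(2007, 2012):
        revenue *= 1 + cagr
        print(year, round(revenue, 2))   # roughly 72% cumulative growth by 2011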

The other big driver IDC identified is the growing trend of replacing physical R&D with computer simulation and modeling. Even if HPC system prices were stagnant, there would still be a cost benefit to using computing in place of physical experimentation, since the latter is almost always labor- and capital-intensive.

Looking at yearly HPC revenue, IDC reported that server revenue topped $10 billion in 2006. According to the firm, adding in other elements of the ecosystem, like storage and services, pushes the total to $16.3 billion. Even that figure may be conservative. Tabor Research, using an end-user research approach, estimates that HPC servers represent only 43 percent of total HPC spending, even excluding items like staff, facilities and power/cooling costs. Applied to IDC’s $10 billion server figure, the Tabor Research model would yield something closer to $23.5 billion in total spending. Now you’re talking real money.
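
The extrapolation itself is simple division. A rough check using a round $10 billion (the server figure above only “topped” that mark, so the exact input is a bit higher):

    # If servers account for 43% of total HPC spending, total = server revenue / 0.43
    server_revenue = 10.0                    # 2006 HPC server revenue, $ billions (rounded down)
    total_spending = server_revenue / 0.43
    print(round(total_spending, 1))          # -> 23.3; a slightly higher server figure lands near $23.5B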

Of course, all of this growth is riding on the popularity of cluster computing systems, which in the third quarter represented 68 percent of all HPC server revenue. With no competing architecture on the horizon offering comparable price/performance for the majority of applications, clusters are destined to maintain their dominance in HPC for the foreseeable future.

Which brings us to something of a paradox. If clusters are such a growth industry, why aren’t there more publicly traded cluster computing system vendors? The big vendors, like IBM, HP, Dell and Sun Microsystems, are public, but for them the HPC server business is just a slice of a much larger set of offerings. The HPC-only cluster system vendors, such as Appro, Linux Networx and Penguin Computing, are all privately held.

Supermicro Inc., a company that sells a range of all-purpose x86 servers, some of which make it into the HPC market, launched its IPO in March 2007. Although the company is profitable, its stock is currently sitting at about $8.50/share (the initial offering was $8.00/share). With double-digit HPC cluster growth, one might have predicted better results. Time will tell.

The closest thing we have to a publicly owned, pure HPC cluster company is SGI, although its globally shared memory, Itanium-based Altix line technically disqualifies it. Plus, the company offers storage and visualization products and also generates significant revenue from services. In any case, as I reported last week, SGI’s year-old cluster business is just now attempting to reach escape velocity and turn a profitable quarter. The stock is currently just above $17/share, following an April high of $30.

Rackable Systems is another publicly traded cluster vendor, but, like Supermicro, it targets a much wider audience than HPC. The company also offers a number of storage products. Despite some innovative engineering in power and cooling, Rackable is finding it tough going in the cluster marketplace after a banner year in 2006. The company’s stock is in no better shape than SGI’s — worse actually. It’s now hovering at around $10/share, down from a January high of over $30.

Theoretically, the tier two players that specialize in cluster systems should be able to compete effectively against the tier one vendors by adding value, undercutting prices, or both. But the commodity nature of cluster computing cuts both ways: it makes entering the market easy, but establishing long-term differentiation hard. Almost everyone is building these machines from the same commodity parts: x86 processors, standard memory chips, Linux (or Windows) software, and Ethernet or InfiniBand gear. Higher-value features that address usability, ease of deployment, and manageability are only now starting to be perceived as being just as important as raw performance. With thin profit margins on the hardware, vendors are using these higher-value features as their “secret sauce.”

The tough competition won’t dissuade vendors, though. With IDC projecting $15 billion in yearly HPC server revenue by 2011, there will be plenty of companies satisfied to get a slice of the market. The fun is just beginning.

—–

As always, comments about HPCwire are welcomed and encouraged. Write to me, Michael Feldman, at [email protected].
