The 15 Percent Solution

By Tom Gibbs, Contributing Author

February 27, 2006

Tom Gibbs, Director, Worldwide Strategy and Planning, Sales and Marketing Group, Intel Corp.

Sir Arthur Conan Doyle came up with the idea of a 7 percent solution for his hero Sherlock Holmes to maintain his mental energy and overcome the stress of sleuthing through the fog of Victorian London. The stimulant I want to discuss is computing infrastructure, and the 15 percent of it that is utilized by the average business around the world to maintain its edge or, in the best case, stimulate real business advantage. If I were a pessimist, I'd describe this as the 85 percent anchor on the ship of progress, but I'm an inveterate optimist. It's an interesting coincidence that the amount of IT budget that is available to invest in innovation is also roughly 15 percent. As coincidental and interesting as these numbers might be, I find myself mustering all of the optimism possible to find a positive spin here.

These statistics are embarrassing, if not downright scary, and they illustrate why there is so much energy around the market phenomenon and community of users and developers using the term “Grid” to describe their approaches to IT solutions. This group of companies and individuals, and the technology they are developing, offers hope that we can improve the utility of both the infrastructure and the IT budget that funds it. Without their efforts, a critical component in the global economic engine could throw a rod and, along with it, stall the continuous march of productivity benefits we have enjoyed for the last 25 years. While the initial intent of the Grid community might have been high-end science, the real end benefit might be the salvation of corporate computing.

While the immediate future might be ominous, it's been amazing to see the economic benefits that have accrued from the convergence of computing and communications over the past five years. As a wise man said: “If it doesn't kill you, it will make you stronger.” The dotcom bust separated the wheat from the chaff, and the hyper-investments in infrastructure of the late '90s are now paying off.

Much like the over-investment in railroad capacity at the end of the 19th century, the hyper-investment in IT laid the track on which ideas run, and its importance in a knowledge economy cannot be overstated. The Wall Street Journal now tracks Black Monday — the Monday after the Thanksgiving holiday, when folks go back to work and start to shop online (during lunch breaks, of course) — as closely as Black Friday — the day immediately after Thanksgiving, which is typically the biggest brick-and-mortar shopping day of the year. Shopping over the internet has become as important as many of us proclaimed a few years ago. Companies in multiple industries are seeing double-digit improvements in productivity with wireless mobile usage models, while some of the top 100 suppliers to Wal-Mart who made structural changes along with their RFID implementations are seeing 30 percent improvements to top-line revenue growth. The open question is this: Can this era of positive benefits from IT continue?

MIT economist Erik Brynjolfsson and his colleague Lorin Hitt, from Wharton, have modeled the ROI of IT and concluded that it generally takes five years to see real payback. Hence, the benefits we saw from 2002 to 2005 were at some level the result of the hyper-investment in IT infrastructure from 1997 to 2000. For the past five years, IT investment has been flat to down as companies slashed costs. The 15 percent solution says that companies can probably eke out gains for the next couple of years, but my premise is that demand will far outstrip supply at this level of investment and, given marginal utility at 15 percent, IT will hit the wall like a marathon runner. Lest we forget, the original marathon runner died as he finished. I think IT might suffer a similar fate if it doesn't change its diet and training regimen soon.

To put this in perspective, let's look at some data. Businesses and governments worldwide invest about $1.2 trillion in IT every year. If we round a little and exchange greenbacks for Euros, it's an even trillion. €1 trillion is about 40 percent less than the total gross domestic product of the United Kingdom or France, midway between the total GDP of Canada and Mexico, and almost identical to the GDP of Russia. To be spectacularly redundant, a trillion is a lot of anything and a hot load of Euros. Some analysts peg IT investment at about 49 percent of overall capital investment in mature economies. So it's a lot, but it isn't going to grow very much. IT investment has been roughly flat to down for the last five years and is forecasted to track or slightly lag GDP growth for the next three years.

This investment level is huge. One can debate whether companies are gaining competitive advantage relative to each other, but the verdict is generally that IT is one of the key technology ingredients to economic growth overall. The economist who figured this out, Robert Merton Solow, won the John Bates Clark Medal in 1961 and the Nobel Prize in economics in 1987. As a side note, have you ever wondered why famous economists usually have three names? Seriously: John Maynard Keynes, John Kenneth Galbraith, Robert Merton Solow, and the list goes on.

Michael Mandel, on the other hand, has only two names — and is not named John. However, he has overcome this limitation to become the chief economic editor at Business Week. In his book Rational Exuberance, he points out the benefits to the overall economy in terms of productivity gains and commensurate increase in GDP that would not have been possible without continual improvements in technology and high technology in particular. Those companies and economic regions that invested in technology and IT infrastructure did well. And those that didn't? Well, they didn't do so well.

In summary, more money than the GDP of many large nations is spent each year on assets that generate significant benefit but are more idle than the average house pet. It's no wonder Nick Carr wrote an article for the Harvard Business Review, and a subsequent book, with the premise that IT doesn't matter. Imagine if a manufacturing company utilized only 15 percent of its factory infrastructure. Can you imagine the reaction of, say, T. Boone Pickens or Carl Icahn? I have a vivid image in my mind of Michael Douglas, playing the character Gordon Gekko in the movie Wall Street, describing why the executive leadership of said company was sucking the blood out of shareholder value while demanding complete restructuring of the board and executive management. I don't see Warren Buffett inviting executives who delivered 15 percent utilization of their factories to jump on his plane to join his buddy William Gates III in an upcoming celebrity bridge tournament.

It doesn't take the winner of the John Bates Clark Medal to see the gross inefficiency here and extrapolate the impact on the business processes it is intended to improve. When infrastructure consumes half of every Euro of capital investment and delivers only 15 percent utilization, there is a big problem. John Bates Clark's claim to fame was the theory of “marginal productivity,” which established a foundational element in the basic theory of capitalism. Even in Clark's time, 15 percent utility was marginal. There is a whole body of economic theory devoted to macro- and microeconomic efficiency, but this situation is too obvious to weigh down with those details.

Bentley's second law of economics states: “The only thing more dangerous than an economist is an amateur economist.” I'm not sure what Bentley's first law states, but the third law almost certainly covers the fact that amateurs should never, ever publish their hypotheses. So, along with being an optimist, I'm a thrill seeker, and I will attempt to forecast the implications of the current macroeconomic investment in IT, the demand function, and the evolution of IT given the current state of its architecture.

I'll begin by grabbing a crayon (Peter Lynch, the famous fund manager who ran the Fidelity Magellan Fund, claimed that all investment ideas should be simple enough to describe with a crayon. Who am I to argue?) and a used envelope from a Christmas card I received last year. The hypothesis is simple: If legacy IT architecture utilization is fixed at 15 percent, and the overall budget is growing at a best-case rate of 5 percent, the amount of budget required to keep the lights on increases at 2 percent each year, while demand is increasing due to an increased need to mobilize and digitize the workforce and engage in e-commerce with business partners and customers. The demand variable might be controversial, as solid data is hard to come by. Some might say that the solution is to keep usage in check in the near term — essentially, to throttle innovation to levels that can be supported within the other two constraints. Unfortunately, the global economy is dynamic and won't stand still for static levels of innovation. In other words: In an unconstrained world, the demand would be increasing even faster, but it is artificially throttled due to the limitations of the existing infrastructure. Guess what? Most of the new entrants into the worldwide economy are unconstrained, and investment inflows are at record levels. If the times are a-changin', the competition is a-increasin'.
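To make the crayon arithmetic concrete, here is a minimal sketch in Python of one way to read the hypothesis above. The specific parameters (a flat overall budget, keep-the-lights-on spending growing 2 percent a year from an 85 percent base, demand for new capability growing 10 percent a year) are my illustrative assumptions for the sketch, not figures from any survey; the point is only to show how quickly the innovation slice gets squeezed once maintenance growth outpaces budget growth.

```python
# Back-of-the-envelope model of the "15 percent solution" squeeze.
# Every parameter here is an illustrative assumption, not measured data.

BUDGET_GROWTH = 0.00    # flat IT budget (try 0.05 for the best case)
MAINT_GROWTH = 0.02     # keep-the-lights-on spend grows ~2 percent a year
DEMAND_GROWTH = 0.10    # assumed growth in demand for new capability

budget = 1.00           # normalize today's total IT budget to 1.0
maintenance = 0.85      # 85 percent of today's budget keeps the lights on
demand = 0.15           # today, demand and the innovation budget just balance

for year in range(2006, 2022):
    innovation = max(budget - maintenance, 0.0)
    flag = "  <-- shortfall" if demand > innovation else ""
    print(f"{year}: innovation budget {innovation:.2f} vs demand {demand:.2f}{flag}")
    budget *= 1 + BUDGET_GROWTH
    maintenance *= 1 + MAINT_GROWTH
    demand *= 1 + DEMAND_GROWTH
```

Run with a flat budget, the innovation slice falls behind demand almost immediately and disappears entirely by the middle of the next decade; nudging the budget growth parameter up to the 5 percent best case pushes the crossover out, but it does not remove it.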

Leading indicators are rearing their ugly heads at companies that are introducing wireless mobility into their workforce or completing RFID technology pilots, which I referred to earlier. In many cases, they are seeing double-digit improvements in productivity or top-line revenue, but they can't deploy on an enterprise-wide basis. Why not? The usual cop-out is security and privacy, and it has the same rhythm and backbeat as a Ray Charles song: “I need some security healing … c'mon I can get some privacy right across town … it feels so good in the mornin' with the curtains down.” Sounds good, feels good, but it is ultimately a distraction. IT can't meet the pent-up demand for new levels of productivity or comparative advantage because it can't afford it, and the current infrastructure can't handle the event-driven workloads, most of which originate outside the legacy firewall.

Anecdotal evidence is showing cracks in one of our economic cornerstones. It's time to go back to the crayons and envelope. Voila! The global IT economy ends in 2012. Oops, or is that 2021? Damn crayons. Well, kindergarten tools or not, you don't have to be Nostradamus or Alan Greenspan to forecast the problems coming for IT. One of the motivations for Bentley's Law is that amateur, and some professional, economic forecasters do not factor in improvements in basic technology. The classic example here is the forecast at the turn of the 20th century that New York City would be buried in horse manure in less than a generation. The forecasters missed the impact of World War I and the eventuality of the horseless carriage.

Whither Grid? The fundamental principle of Grid computing is to virtualize and simplify the underlying infrastructure so that it can be fully utilized. By adopting some of the autonomic tools that are required to manage a large-scale Grid infrastructure, IT can start to whittle away at the bloated overhead associated with legacy architectures. As utilization improves, we can simultaneously reduce waste and invest in innovation.
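To illustrate the utilization argument, here is a minimal sketch, again in Python, of the kind of consolidation a virtualized, Grid-style scheduler performs: packing workloads that each occupy a sliver of a dedicated server onto a shared pool. The workload figures, the 80 percent headroom limit and the first-fit policy are my illustrative assumptions, not a description of any particular Grid product.

```python
# Minimal sketch: consolidating lightly loaded, dedicated servers onto a
# shared, virtualized pool. All figures are illustrative assumptions.

workloads = [0.15, 0.10, 0.20, 0.05, 0.15, 0.10, 0.25, 0.10]  # avg CPU demand per app

# Legacy model: one dedicated server per application.
legacy_servers = len(workloads)
legacy_utilization = sum(workloads) / legacy_servers

# Virtualized model: first-fit packing onto shared hosts, leaving headroom.
HEADROOM = 0.80   # don't fill a host past 80 percent of capacity
hosts = []        # current load on each shared host
for demand in workloads:
    for i, load in enumerate(hosts):
        if load + demand <= HEADROOM:
            hosts[i] += demand
            break
    else:
        hosts.append(demand)   # no existing host has room; add one

pooled_utilization = sum(workloads) / len(hosts)
print(f"Dedicated servers: {legacy_servers}, utilization {legacy_utilization:.0%}")
print(f"Shared hosts:      {len(hosts)}, utilization {pooled_utilization:.0%}")
```

Under these made-up numbers, the same eight applications run on two shared hosts at roughly 55 percent utilization instead of eight dedicated boxes at about 14 percent. Real schedulers also have to account for peak loads, failover and placement constraints, which is exactly where the autonomic management tools come in.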

I often get the question, “Will Grid ever become part of mainstream enterprise IT?” The question is almost amusing, in a Lenny Bruce sort of way. My answer, with a dose of Lenny's attitude, is: “Holy crap! How can enterprise IT survive without adopting the technologies and architectural concepts of the Grid community?” Otherwise, to paraphrase Lenny (who at the time was deriding telecommunications monopolies), you end up like a schmuck with a Dixie cup on a string!

It's kind of a harsh approach to advocate Grid solutions this way, but it's time to “rip and replace.” These are three words you will never hear from a vendor. No one ever closed a deal by scaring the hell out of their customer. But the crayon doesn't lie. The strategies of “surround and conquer” or “embrace and extend” might have worked if incremental investments had continued at dotcom boom levels, but they haven't. We've just finished the fifth year of lagging incremental investment and the cracks are already showing. It's time to start to make some dramatic changes right now. Get the standards right, stop any nonsensical vendor infighting and get on with the hard task of overhauling legacy with Grid solutions. Economic prosperity hangs in the balance.

About Tom Gibbs

Tom Gibbs is director of worldwide strategy and planning in the sales and marketing group at Intel Corp. He is responsible for developing global industry marketing strategies, building cooperative market development, and marketing campaigns with Intel's partners worldwide. Gibbs joined Intel in 1991 in the Scalable Systems division as a sales segment manager. He then worked in Intel's Enterprise Server group, where he was responsible for business growth with all OEM customers with products that scaled greater than 4-way. Finally, just prior to joining the Solutions Market Development group, he was in the Workstation Products group — responsible for all board and system product development and sales. Prior to Intel, Gibbs held technical marketing management and industry sales management positions with FPS Computing, and engineering design and development for airborne radar systems at Hughes Aircraft Company. He is a graduate in electrical engineering from California Polytechnic University in San Luis Obispo and was a member of the graduate fellowship program at Hughes Aircraft Company, where his areas of study included non-linear control systems, artificial intelligence and stochastic processes. He also previously served on the President's Information Technology Advisory Council for open source computing.
