Utility Computing Tears Down the Water Wheel

By JT Litchfield, Conference Manager and Editor, Xtalks

April 23, 2007

Utility computing has sparked imaginations. Eliminating capital investment, provisioning resources dynamically and paying as you go for what you actually use are very attractive features. Recovering automatically from hardware failures alone could save many enterprises millions. Many considered this utopian concept unattainable, but academics have operated this way for more than a decade through the use of grid computing. Unfortunately, the benefits of grid computing were limited to technical computations and required code written specifically to run on the grid. Mainstream business applications have been locked out. Until now.
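
The provision-on-demand and automatic-recovery features described above can be made concrete with a small sketch. The Python snippet below is illustrative only: GridClient and its methods are hypothetical placeholders for a utility provider's interface, not any real service's API.

```python
# A minimal sketch of "provision dynamically, recover automatically."
# GridClient is a hypothetical stand-in for a utility provider's API.

import time


class GridClient:
    """Hypothetical utility-computing API: request and release capacity
    on demand instead of owning servers."""

    def provision(self, n: int) -> list[str]:
        return [f"node-{i}" for i in range(n)]  # stub: allocate n nodes

    def is_healthy(self, node: str) -> bool:
        return True  # stub: a real API would probe the node

    def replace(self, node: str) -> str:
        return node + "-replacement"  # stub: swap in fresh capacity


def run_with_recovery(grid: GridClient, nodes_needed: int,
                      checks: int = 3) -> None:
    # Pay-as-you-go: capacity is requested, not purchased up front.
    nodes = grid.provision(nodes_needed)
    for _ in range(checks):
        # Automatic recovery: failed hardware is the utility's problem;
        # the tenant simply asks for a replacement node.
        nodes = [n if grid.is_healthy(n) else grid.replace(n)
                 for n in nodes]
        time.sleep(1)  # a real loop would re-check health periodically
    print("healthy nodes:", nodes)


if __name__ == "__main__":
    run_with_recovery(GridClient(), nodes_needed=4)
```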

Recent advances in grid technology have enabled the first true utility computing services. For the first time, companies can build, deploy and scale complex online systems without owning or operating a single server. More importantly, this simplification allows enterprises to regain control of their IT platforms, enabling them to accelerate business-critical updates, features and additions in support of revenue objectives. SaaS and Web 2.0 companies, along with forward-looking enterprises, are already starting to take advantage of this new business model. Gone are the time, cost and hardware barriers associated with bringing transactional Web applications to market and scaling them as service demand grows.

In an Xtalks Web conference on April 11, Nicholas Carr, former executive editor of the Harvard Business Review and acclaimed business writer, discussed the emergence of grid computing in a broader, historical context. Analyzing the rise of what Carr called the "Third Age" of IT from a strategic and economic standpoint, he took his audience through an IT timeline, aiming to explain the rise of, and ultimate need for, grid computing by comparing the modern IT sector to the water wheel of industrialization.

At the very beginning of the Industrial Revolution, manufacturers had to rely on in-house power sources (windmills, water wheels and the like) to supply themselves with the vast quantities of power needed to run their factories. This model was inherently inefficient, forcing companies to spend a good portion of their budgets on labor and maintenance to keep their power sources humming. Carr pointed out that today's IT model suffers, more or less, from the same symptoms. Manufacturers found a solution in the utility model of electricity. All of a sudden, they were able to plug into a grid and access the power they needed on demand and at a much lower cost.

This analogy led to the introduction of Carr's first principle of any business model: "The supply of any business resource will gravitate toward its most economically efficient model." The question he then put to his audience was whether current business models were the most efficient for IT supply. Obviously, he believes they are not.

In what Carr labeled the "Second Age" of IT, the current age, businesses rely on an in-house model that maps neatly onto the early manufacturers' power sources: the client-server setup is today's water wheel. The PC, while extremely personal, is also very inefficient by nature. Businesses have to rely on the same hardware and software as their competitors and hire the same kinds of people to run them, resulting in huge amounts of redundancy. What's worse, Carr explained, all of it is amplified by the tight coupling between hardware and software, the physical and logical aspects of IT.

All of this leads to vast diseconomies in software and labor and a failure to capitalize on software's economies of scale. Carr asserts that 60 percent of IT labor currently consists of routine maintenance and support, which ultimately becomes a massive drain on management time. Just like manufacturers at the dawn of the Industrial Revolution, companies now have to focus on their products and services while also running an internal IT department. He also points out that private IT infrastructure accounts for 50 percent of most companies' total equipment investment. In short, he believes the current client-server model of business computing is hamstringing IT.

Is there a better way? Carr believes that the client-server model is already being replaced with a utility or grid model, a centralized, shared infrastructure, just as the water wheel was replaced by the electric utility.

Looking at IT from a purely economic standpoint, Carr believes we can begin to see why the utility computing model is so compelling. IT, he explains, is a general-purpose technology: not a tool itself, but a platform for tools. It has vast potential for economies of scale if supply can be consolidated and centralized. Consolidation, however, requires new technologies and fresh business models. Fortunately, according to Carr, utility technologies are maturing. For the utility computing model to succeed, a "grid" of technologies needs to be in place: high-speed Internet networks, data-processing dynamos like Google's new datacenters, virtualized multi-tenant infrastructure, IT automation so software can replace redundant labor, and sensible metering and pricing. Virtualization, in particular, will allow IT to break the lock between hardware and software. A "best of both worlds" scenario will unfold, combining the high efficiency of utility grids with the personalization of the client-server model.
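
The "sensible metering and pricing" leg of that grid is easy to sketch. The Python example below shows pay-per-use billing in its simplest form; the resource names and unit rates are purely illustrative assumptions, not any real provider's pricing.

```python
# A minimal sketch of utility-style metering and billing, using
# hypothetical resource names and illustrative rates.

from dataclasses import dataclass


@dataclass
class UsageRecord:
    """One metered interval of consumption for a single tenant."""
    tenant: str
    cpu_hours: float       # compute consumed, in CPU-hours
    gb_stored: float       # average storage held, in gigabytes
    gb_transferred: float  # network egress, in gigabytes


# Illustrative unit prices; a real utility would publish its own.
RATES = {"cpu_hours": 0.10, "gb_stored": 0.15, "gb_transferred": 0.20}


def bill(record: UsageRecord) -> float:
    """Price a usage record: the tenant pays only for what was
    actually consumed, with no up-front capital outlay."""
    return round(
        record.cpu_hours * RATES["cpu_hours"]
        + record.gb_stored * RATES["gb_stored"]
        + record.gb_transferred * RATES["gb_transferred"],
        2,
    )


if __name__ == "__main__":
    january = UsageRecord("acme-web", cpu_hours=720.0,
                          gb_stored=50.0, gb_transferred=120.0)
    print(f"{january.tenant}: ${bill(january):.2f}")  # acme-web: $103.50
```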

This theoretical IT utopia, although slowly becoming reality, requires careful management of the transition, Carr pointed out. There are three main tensions to be aware of moving forward: dedicated software versus utility software; internal utility versus external; and whether your company should be a leader or a follower as the evolution unfolds. What should be outsourced, and what should be created internally? Although there will be many challenges along the way, Carr asserted that, for those who manage the transition wisely, there will be huge opportunities and benefits.

A former executive editor of the Harvard Business Review, Carr is an acclaimed business writer and speaker whose work centers on strategy, innovation and technology. His 2004 book “Does IT Matter? Information Technology and the Corrosion of Competitive Advantage,” published by Harvard Business School Press, set off a worldwide debate about the role of computers in business.

In addition to writing more than a dozen articles and interviews for Harvard Business Review, Carr has written for the New York Times, Financial Times, MIT Sloan Management Review, Wired, Business 2.0, The Banker and Journal of Business Strategy. He writes a column on innovation for Strategy & Business, where he's a contributing editor, and publishes the popular blog Rough Type.

Carr has appeared as a business commentator on CNN, CNBC, BBC Radio and National Public Radio and is a sought-after speaker on information technology. He holds a B.S. from Dartmouth College and an M.A. from Harvard University.

If you would like to listen to Carr's Xtalks presentation in full, please visit http://xtalks.com/gridcomputing0704.ashx.

Xtalks is part of the Honeycomb Worldwide group of companies. Xtalks develops and presents objective, practitioner-led Web conferences focusing on current issues and trends within the IT industry. For a complete listing of upcoming IT-focused seminars, visit www.Xtalks.com.
