On the Origin of Utility Computing

By Derrick Harris, Editor

September 24, 2007

Although it has been criticized by many analysts and experts who question its overall business model — often wondering whether mainstream users would ever be comfortable running mission-critical jobs or services externally — utility computing has not gone away.

In fact, anyone who asks around is very likely to hear about a wide variety of utility solutions being offered by an equally varied group of vendors. Dig a little deeper into these solutions, and it becomes clear that utility computing has matured since its early days, when Company A simply offered Company B the ability to run jobs on Company A’s big collection of servers. Truth be told, “matured” might not be the right word; it probably would be more accurate to say that, like so many technologies before it, utility computing has “evolved.”

And just like species whose current members have branched off into incarnations that barely (if at all) resemble their ancient ancestors, utility computing today takes many, sometimes unrecognizable, forms. However, unlike those species, the various incarnations of utility computing seek to do more than just survive — they seek to transform the way the world does enterprise computing.

In this two-part series, we will take a look at four distinct utility models, which, although very different aesthetically, all aim to give users on-demand access to needed resources while easing the increasingly cumbersome task of datacenter management. To start, we examine Sun’s Network.com and Amazon’s Elastic Compute Cloud (EC2), two services that tackle external utility computing in very different ways …

Grid Computing for the Masses

Perhaps the most familiar-looking utility model we’ll discuss here, Sun Microsystems’ Network.com offers users the ability to run their compute-intensive applications on the Sun Grid, essentially a Sun Grid Engine-powered datacenter, for the firm price of $1/CPU/hour.
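
As a quick illustration of how that metering works (the job sizes below are hypothetical, not Sun’s), the charge is simply the number of CPUs a job uses, multiplied by the hours it runs, multiplied by the $1 rate:

```python
# Hypothetical worked example of the Sun Grid's $1/CPU/hour metering.
def sun_grid_cost(cpus: int, hours: float, rate_per_cpu_hour: float = 1.0) -> float:
    """Return the charge in dollars for a job of the given size."""
    return cpus * hours * rate_per_cpu_hour

# e.g., a 100-CPU job that runs for 3 hours costs $300.
print(sun_grid_cost(cpus=100, hours=3))  # -> 300.0
```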

When Network.com launched in March 2006, users were required to write their own applications to fit the Solaris-based grid, which they could then get up and running via a Web interface. Since then, however, Sun has been tweaking Network.com to make it more user-friendly, most notably by adding an application catalog featuring a variety of applications across a range of industries. Customers using these pre-configured applications simply submit their data and the application runs — there is no need to write or rewrite code to specifically fit the Network.com infrastructure.

While this model has its fans, particularly among traditional HPC users like life sciences and modeling shops, Sun is looking for more users, which just might come thanks to a couple of emerging use cases. According to Mark Herring, director of marketing for Network.com, Sun is seeing increased interest from ISVs looking to leverage the grid’s resources to provide software services to third-party customers.

In some cases, such as with financial services ISV CDO2 and sales performance management leader Callidus, Network.com gives these ISVs a relatively inexpensive and simple way to get into the software-as-a-service market without having to host applications on their internal hardware. The formula is pretty simple: customers pay the software vendors to utilize their on-demand applications (generally via a Web portal), which actually are being run on Network.com. A similar model also is being utilized by data management company InfoSolve, which simply uses Network.com as its backend resource center. Right now, every data quality service InfoSolve runs for its customers is done on the Sun Grid. Although it is too early to tell, Herring believes this model could mean big business for Sun, as it offers an option for getting general computing ISVs and users on board in a highly transparent manner.

For Sun, though, its aspirations don’t end with a new usage model for Network.com; the company is searching for the elusive “killer app” that will do for on-demand computing what Google Maps did for Ajax. According to Herring, while the near-term goals for Network.com are to bring more applications into its catalog — particularly in the life sciences area — the company is hearing “murmurs and noise” suggesting there is a demand for the ability to run non-grid-enabled applications on the Network.com infrastructure, and Sun also is thinking about working development and office productivity tools into the fold.

The reason for this, said Herring, lies in the presumption that what we call “utility” today is “going to take more and more of the lion’s share of computing, period.” Sun doesn’t believe that a one-size-fits-all approach to the utility market will suffice, so now that Network.com has grid under its belt, it can start looking at other models, such as more general hosting, storage farms and Google-type software applications. Although Herring can’t elaborate on details, he noted that some of these potential services are currently being demoed internally.

“We definitely don’t look at Network.com and say, ‘Hey, we’re done here. We’ve solved utility computing,’” said Herring. “We’ve solved a piece of it, [but] there’s a lot more pieces and I think the only thing that creates a complete solution is to have each one of those use cases taken care of.”

Bare Metal, Web Services and ‘Elasticity’

Of all the utility services being offered today — outsourced or in-house, virtualized or physical — the one with the most buzz surrounding it has to be Amazon’s Elastic Compute Cloud (EC2). The underlying infrastructure was developed initially to relieve Amazon’s many internal teams of the various “heavy lifting” tasks necessary to launch the company’s software services — tasks that consumed time and money that could have been spent delivering actual business value — and the company eventually realized that the utility infrastructure in which it had invested so much money could deliver real value outside of Amazon, as well.

What separates EC2 from its competitors, said Amazon CTO Werner Vogels, is that EC2, as well as its sister S3 (Simple Storage Service), is designed for developers and relies on Web services. To get started, a developer: (1) selects an Amazon Machine Image (AMI), a Xen-enabled Linux image ranging from standard Red Hat images to specialized images with Hadoop parallel computing or specific grid services built in, or constructs his own; (2) communicates with EC2 via Web services calls to determine how many of the AMIs to start, get the AMIs instantiated, and figure out the IP addresses and other virtual machine specs; and (3) configures security around the AMIs, deciding who can access which services. “You can do computation or you can offer a service to the outside world,” said Vogels. “Whatever you do inside these environments is all up to you.”
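
For readers who want a more concrete picture of those three steps, below is a minimal sketch using Amazon’s present-day boto3 Python SDK, which postdates this article; in 2007 the same operations were exposed as SOAP/Query Web services calls, so treat this purely as an illustration of the workflow rather than of the period API. The AMI ID, key pair name, security group ID and address range are hypothetical placeholders.

```python
# Minimal sketch of the EC2 workflow described above, using boto3 for
# illustration. All identifiers below are hypothetical placeholders.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Steps (1) and (2): launch two instances of a chosen image, then ask EC2
# which IP addresses and machine specs were assigned to them.
resp = ec2.run_instances(
    ImageId="ami-00000000",            # hypothetical Linux AMI
    MinCount=2,
    MaxCount=2,
    KeyName="my-keypair",              # hypothetical key pair
    SecurityGroupIds=["sg-00000000"],  # hypothetical security group
)
instance_ids = [inst["InstanceId"] for inst in resp["Instances"]]

described = ec2.describe_instances(InstanceIds=instance_ids)
for reservation in described["Reservations"]:
    for inst in reservation["Instances"]:
        print(inst["InstanceId"], inst.get("PublicIpAddress"), inst["InstanceType"])

# Step (3): configure security around the instances, in this case opening
# SSH (port 22) to a single address range on the hypothetical security group.
ec2.authorize_security_group_ingress(
    GroupId="sg-00000000",
    IpPermissions=[{
        "IpProtocol": "tcp",
        "FromPort": 22,
        "ToPort": 22,
        "IpRanges": [{"CidrIp": "203.0.113.0/24"}],  # example CIDR block
    }],
)
```

The dynamic, pay-per-use quality Vogels describes follows from the same pattern: an identical call can start two instances or two hundred, and they can be terminated the moment the work is done.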

EC2 also sets itself apart by providing low-level services, or what Vogels calls “infrastructure-level” services, that offer access to as close to the bare metal as possible. When compared to Network.com, for example, a user’s AMIs in EC2 would be analogous to the physical infrastructure that makes up the Sun Grid. The big difference, however, is that EC2 users can run whatever services they want within that “infrastructure,” grid or not. According to Vogels, when combined with the dynamic nature of EC2, this freedom of services is one of the solution’s biggest draws.

As evidence of just how wide open the platform is when a little creativity is applied to it, Vogels can rattle off an expansive list of EC2 use cases, which includes, among others: grid or parallel computing; Web 2.0; testing and integration; third-party rendering; search engines; and Web crawlers. However, he said, some markets, such as those that have traditionally utilized grid or HPC technologies, move faster than others. In that case, Vogels said, users are finding that moving to EC2 is not too big a step from running and managing their own parallel and/or distributed datacenters.

Despite its fundamental differences from Network.com, though, the two solutions do have something in common: both are proving popular with “traditional software houses” that want to break into the software-as-a-service (SaaS) market. Although the companies see the potential of SaaS, said Vogels, they often have little operational experience beyond running their own Web sites, and they almost certainly have no experience running large-scale datacenters out of which they have to offer services. Just like for Amazon’s internal teams, EC2 allows these companies to minimize their datacenter management issues and focus on their core strengths around software development.

In addition, as noted earlier, EC2 also is benefiting from the burgeoning Web 2.0 market. Just like with SaaS customers, Vogels said EC2 offers Web 2.0 firms a prime opportunity to focus on the important issues. “… [EC2] allows them to focus their scarce resources — in this case, finances — on actually acquiring talent instead of acquiring computer servers,” commented Vogels. Because these companies often only get one shot at success, he added, it is crucial that they can prepare themselves for success without making huge upfront investments in datacenter resources.

With such a wide breadth of uses, one probably shouldn’t be surprised to learn that EC2, which currently is in a limited beta mode, is experiencing “almost unlimited demand,” with presently available resources continuously in use and a long line of interested customers. From Vogels’ point of view, this demand should only continue to grow, as users really like on-demand resources and love paying only for actual usage. A low barrier to entry doesn’t hurt, either.

Citing Wall Street firms and government agencies as real-life examples, Vogels said many of EC2’s heaviest users came on board just to experiment and ended up getting hooked. “[A]s long as you’re a developer with a credit card,” Vogels summarized, “you can do this.”

—–

Be sure to watch for next week’s issue, where we will present two solutions that turn utility computing on its head by bringing this traditionally external practice in-house.
