It Takes More than Virtualization to Build a Cloud

By Derrick Harris, Editor

July 14, 2008

As the IT world struggles toward a nicely packaged definition of cloud computing, marketing personnel who wish to leverage its buzz are engaged in a struggle of their own. First, they need to determine which, if any, of their offerings can arguably be considered cloud computing under any of the various working definitions currently populating the market. If the connection is attenuated, the question becomes whether to shoehorn their solution into one of those definitions or to hold off for the time being and suggest the company start implementing features that will make for a better fit.

This decision is not easy. Jumping on a blazing hot bandwagon could be a ticket to overnight success, but latching onto an unproven fad could be fatal. And what about that struggle for a universal definition? Is stretching the boundaries to include your offering detrimental in that it adds further confusion to the mix, or is there plenty of room for everyone who wants to frolic up in the clouds?

Companies with roots in the traditional managed hosting field are having to address this issue more and more as virtualization establishes itself in their offerings. As it turns out, their reasons for leveraging or not leveraging the cloud buzz are just as varied as the definitions with which they are working.

“We are cloud computing”

The cloud computing landscape is changing so fast that after an early-spring blog post delineating his company’s offering from cloud computing offerings, GoGrid Technology Evangelist Michael Sheehan concluded that the two simply were not one and the same. Fast-forward a few months to late June, and Sheehan has changed his tune — perhaps with a little goading from President and Co-Founder John Keagy, who believes that, given the breadth of definitions out there, “for us to say that GoGrid is not cloud computing is just silly.”

One of the definitions from which Keagy draws his unequivocal determination comes from Forrester Research, which has included an underlying “grid engine” architecture as one element of a cloud. As its name might imply, GoGrid, a division of dedicated Web-hosting provider ServePath, utilizes a virtualized grid architecture to instantly deliver VMs to customers. “Grids are reincarnated in a new form, and it’s called ‘cloud computing,’” says Keagy. Sheehan concurs, adding that not only is GoGrid built using a grid architecture, but all other clouds are, as well.

Additionally, Sheehan has devised a pyramid diagram of the cloud marketplace, with the base level being infrastructure-as-a-service offerings, like Amazon’s Elastic Compute Cloud (EC2). Allowing users to launch and provision VMs with only a credit card, GoGrid provides a service that, essentially, is the same as EC2. To hear Keagy tell it, no one would deny that EC2 is a cloud, and because GoGrid is the only alternative to EC2, it, too, must be a cloud.

Taking it a step further, the folks at GoGrid actually believe their product is more true to cloud computing’s notions of openness, simplicity and flexibility than is EC2. In terms of simplicity, Sheehan says GoGrid is all about making cloud computing “less nebulous and [more] tangible to the end-user.” Whereas Amazon has an 18-minute video instructing EC2 greenhorns on how to get started, he says a GoGrid first-timer can be up and running in five minutes using the company’s almost-too-easy GUI. (Ed. Note: He’s not lying — at least in terms of provisioning a few machines and adding a load balancer and a database.)

In terms of flexibility, Keagy believes GoGrid’s hardware virtualization model gives users far more options than does EC2. For one, it gives users as close to a bare-metal experience as possible, even providing console access as if you were working on a dedicated Windows, Linux or Debian server, giving users, for all intents and purposes, as much control as an “old-school environment.” In fact, says Keagy, a user could load software from his desktop DVD drive onto a GoGrid machine if he were so inclined.

Addressing the comparisons between infrastructure-as-a-service offerings like GoGrid and more-limited platform-as-a-service offerings like Google App Engine and Mosso, Keagy offers the following explanation: “App Engine, albeit it’s new, is very exciting if you’re a punk kid who wants to whip off a quick app fresh from scratch in your dorm room … [and] that’s obviously cloud computing … But GoGrid also is obviously cloud computing, and you can take your enterprise application, which might need a combination of servers running Debian, Linux and Windows, and a bunch of private networking features that you need to coordinate within your vLAN, and we give you root access.”

As for what makes GoGrid a cloud service provider and parent company ServePath a traditional hosting provider, Sheehan notes that on top of simply giving customers resources on which to host their applications and providing SLAs, cloud services should offer instant scalability and utility pricing, both of which describe GoGrid.
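Utility pricing here means metering actual resource consumption rather than charging a flat monthly fee. As a rough illustration only (the RAM-hour unit and the rate below are hypothetical, not GoGrid’s actual price list), a metered bill might be computed like this:

```python
def utility_bill(usage, rate_per_ram_hour):
    """Compute a utility-style bill from metered RAM-hours.

    usage: list of (ram_gb, hours) tuples, one per server instance.
    rate_per_ram_hour: price per GB of RAM per hour (hypothetical rate).
    """
    # Total consumption is summed across all instances, however briefly they ran.
    ram_hours = sum(ram_gb * hours for ram_gb, hours in usage)
    return ram_hours * rate_per_ram_hour

# Three small servers run for a day; one large server runs for only two hours.
usage = [(0.5, 24), (0.5, 24), (1, 24), (8, 2)]
print(utility_bill(usage, 0.19))  # bill covers only the 64 RAM-hours consumed
```

Under a flat monthly contract, those four servers would cost the same whether they ran for two hours or all month; utility pricing, by contrast, scales the bill down with usage.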

“Above all,” says Keagy, “we are cloud computing.”

All in Due Time

On the opposite side of the aisle is Layered Technologies, a company that definitely could squeeze its Virtual Private Datacenter (VPDC) solutions into some definition of cloud computing and run with it, but that is in no hurry to do so.

For Todd Abrams, president and chief operating officer of Layered Tech, cloud computing means having resources distributed over multiple points of presence, but connected and pooled so that applications are not concerned with where or what they are. “It’s a big marketing thing because, honestly, I don’t see how you can say you’ve got cloud computing if you have a single location,” he explains. “In my mind, you have to have diversity.” Layered Tech is addressing this, he added, by buying and opening new datacenters across the globe, in cities like Dallas, Chicago and Tokyo.

Functionality-wise, Abrams sees easy-to-understand GUIs and utility-style billing as components of a cloud solution, and these are not currently available in the company’s VPDC offerings. However, he noted, Layered Tech is working on its GUI, its utility billing scheme and its credit card-billing mechanism, and should have a “cloud” of resources in a few months. Cloud computing, he says, is both an evolution and an extension of VPDC, but one that will materialize as technology continues to change and features continue to get added.

One reason Layered Tech isn’t gung-ho about ramping up its cloud marketing is that the company has a good thing going with VPDC and, says Abrams, he sees a good bang for his buck from Amazon’s EC2 and S3 solutions, which customers often reference when looking at Layered Tech’s GridLayer family of offerings. This could help to explain why even without calling it cloud computing, VPDC represents 10 percent of the company’s revenue and has garnered attention from “big, big” players in the financial services, credit card, medical and telecom industries.

Of course, the other reason Layered Tech isn’t marketing itself as a cloud computing provider could be that the distinction between VPDC and solutions like EC2 is a good thing. Abrams says that even if they are addressed sufficiently, enterprises always will have security concerns around shared infrastructures — it’s a mind thing. “A lot of guys in IT are server-huggers, they want to touch their servers and feel their servers,” he says. “Once you take that away from them, they don’t really know what to do, because that’s their life and their job.” To address this, VPDC users get a dedicated instance of nodes with which they can do as they please, and if one user’s nodes go down, it won’t affect other users. VPDC is part of a larger backbone, says Abrams, but instances are contained within a user’s nodes.

In another deviation from cloud standard operating practices, VPDC, which ranges from $1,700 to $4,000 a month to start, isn’t cheap, acknowledges Abrams. “But … when you work out the math and what the benefits are of what’s in that Virtual Private Datacenter, if you’re going to build in the scalability, you’re going to build in the redundancy, if you’re going to have the load balancers, firewalls, etc., in that, you’re at way more than four grand a month [doing it internally or through a traditional provider like AT&T].”

Finally, Abrams sees cloud computing as being driven by applications that are completely agnostic in terms of hardware type or geographical location, but many enterprises don’t have the luxury of not caring. Compliance measures can require knowing where data is being stored and how it is retained, and Layered Tech’s ability to address customers’ compliance needs is not within the present realm of cloud computing. The company’s Dallas datacenter, for example, is SAS 70-compliant, Abrams notes. For testing and development purposes, however, Abrams sees cloud computing and solutions like Layered Tech’s VPDC developer packages being great fits.

What Layered Tech’s cloud product will look like remains to be seen, but due to the popularity of VPDC among enterprises, it appears the two solutions will be clearly distinct.

Keep ‘em Separate

Just as ServePath formed GoGrid to handle what it now calls its cloud computing offering, Rackspace formed Mosso to address the cloud computing market. However, unlike ServePath, Rackspace already had a virtualization offering in the same vein as Layered Tech’s VPDC. So why not use that offering as its foray into cloud computing?

According to Lew Moorman, Rackspace’s senior vice president of strategy and corporate development, the answer has a lot to do with both his definition of cloud computing and the strategic goal of attracting enterprise users. Talking about the latter, Moorman believes that, given the current state of cloud computing, highly customized and complex applications (like most mission-critical enterprise applications) are better suited for a traditional infrastructure. “The problem … is that in pretty much every single cloud … you are dealing with a multi-tenant, shared environment. And with multi-tenant and shared environments come restrictions,” he says. “If you meet them, then, man, you get all the benefits — and isn’t that great — but if you don’t, you’ve got problems. It won’t work.” He added that it will be a long time “until someone is running their Oracle accounting in the cloud.”

Aside from being multi-tenant, Moorman also defines clouds as revolving around concepts like pay per use, self service, quick provisioning and automation of key tasks, like scaling. In addition, he believes clouds must have a “very serious” software layer between the applications and underlying infrastructure to make the whole experience as seamless as possible.

Although Rackspace’s virtualization offering does offer utility billing and rapid provisioning, Moorman says the fact that it consists of dedicated virtual servers means it is not a cloud service. “We have to wrestle with how to market these things, and we just haven’t applied the ‘cloud’ label to it yet,” he says. If the offering starts to mature and Rackspace starts selling it more incrementally, then maybe the “cloud” label will make sense, but, for now, the company is selling dedicated infrastructure using virtualization as a tool.

However, Moorman notes, depending on the definition with which someone is working, the line between dedicated hosting and cloud computing can get a bit murky. In some ways, he says, the whole idea of centralized computing, of not having a server closet in your office, can be considered cloud computing. If that’s the case, the “cloud” label probably will spread over traditional hosting to the point where the two become indistinguishable. For the time being, Moorman said, Rackspace welcomes the distinction and only applies the cloud label to the Mosso offering, which has a cloud software layer and is clearly in the platform-as-a-service sector of the cloud. Overall, he believes the hosting market has done a good job of not co-opting the term.

The Expert’s View

Antonio Piraino’s professional life probably was running smoothly enough until cloud computing came along. Now, says Piraino, senior analyst for managed hosting with Tier 1 Research, he’s “more and more” involved with cloud computing because “that’s where there’s a bigger impact on all the managed hosters … from all those cloud infrastructures that have been put in place.”

Like the rest of us, Piraino has had to sift through the rubble to find a definition that suits his job functions. For starters, advances in virtualization have really helped to blur the line between what is managed hosting and what is cloud computing. Now, he says, managed hosting providers can tell customers, “You not only come on and get a raw VM, but if you need billing on the fly, we’ll suddenly create a billing system for you. If you suddenly need load balancing or this type of thing, we’ll do it all for you.” The result, he says, resembles glorified dedicated hosting, but being able to move stuff around on the fly and “play around” with the set-up brings a cloud element to it. However, if you ask Amazon, says Piraino, they’ll tell you it’s all about how you charge users. And IT service providers like AT&T and Verizon Business will say they provide storage on demand and utility computing capabilities, but it takes five days to provision. “[F]or an enterprise customer,” he noted, “that might be a really quick provisioning time.”

The result of Piraino’s quest to define cloud computing is a list of properties that every cloud should have. Among those properties, and where the distinction between managed hosting and cloud services becomes a little clearer, are: real-time, accessible infrastructure; scalability on the fly; a utility billing model; credit card billing; and a platform orientation. Managed hosting solutions usually lack these characteristics, and usually by design. For example, says Piraino, automated scalability is easier with a consistent, simple configuration than with a more complex enterprise-style configuration. Likewise, while credit card billing might eliminate the hassle of having to deal with a salesperson in order to get started, large corporations might actually want contracts and not want to be anonymous. If Wal-Mart is going to use EC2 to handle excess holiday traffic, Piraino explains, it would seem the retailer would want Amazon to know what’s coming so nothing is at risk.

A prime example of this dichotomy, he says, is Rackspace, which hasn’t wanted to get involved with automated scalability and application-aware capabilities, but which has Mosso to fill that void.

Tomayto or Tomahto, the Result is the Same

If the ultimate goal of both managed hosting and cloud computing providers is to relieve IT departments of the headaches and costs associated with datacenter management, perhaps it doesn’t matter what we call them. GoGrid’s Keagy seems to think so, saying that “we’d all be wise to let cloud computing be as broad and fanatical [as possible].”

“I think what’s going on, what’s driving your career and mine, is outsourcing,” he explained. “Ninety-nine percent of IT still is not outsourced. I’m looking out across downtown San Francisco, and I just know this place is chock-a-block full of servers packed back in little closets, and this cloud computing thing is what’s going to drive all those people [who] are buying and maintaining all those servers in all those closets … to overcome their outsourcing objections.”
