10 Companies Spurring Enterprise Cloud Computing (Pt. II)

By Derrick Harris and Dennis Barker

November 21, 2008

In the first installment, we looked at the cloud computing offerings of the world’s leading IT software vendors, and why they will draw enterprises to the cloud. This time, we look at companies that came of age during the Web era, some as recently as last year.

While their customer counts range from dozens to hundreds of thousands, the one thing these companies all have in common is innovation. From enabling computing in the cloud to encouraging hybrid cloud computing to providing an end-to-end set of cloud products, these companies’ technologies strive to make cloud computing as flexible and non-threatening as possible. Given the skepticism some enterprises have about the cloud, anything that eases the process and improves its performance can only be a good thing.


Skytap

While most cloud providers hold production workloads as the holy grail for proving the worth of their platforms, Skytap focuses on a more immediate goal. Understanding that provisioning test environments adds both time and cost to a company’s operations, and that security concerns will mire many production cloud migrations, Skytap lets users build virtual test labs in the cloud that mirror their physical counterparts.

Though Skytap is relatively young even by Web standards, its customers have a library of enterprise-class operating systems and databases to choose from, as well as a variety of options for image sizes, and snapshots of configurations can be made for easy provisioning in the future. What’s more, environments can be snapshotted at any point in the process, allowing users to isolate bugs or faulty configurations and resolve them later.
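To make the snapshot-and-provision workflow concrete, here is a minimal Python sketch of how a test lab of this kind might be driven programmatically. The `testlab` client and every method and image name below are hypothetical illustrations, not Skytap’s actual API.

```python
# Hypothetical sketch of snapshot-based test-lab provisioning.
# The "testlab" client and these method names are illustrative only,
# not Skytap's real API.
import testlab  # hypothetical SDK

client = testlab.Client(api_key="YOUR_API_KEY")

# Build a test environment from library images that mirror production.
env = client.create_environment(
    name="qa-regression",
    images=["windows-server-2003", "oracle-10g"],  # enterprise-class library images
    size="medium",
)

# Snapshot the known-good configuration so it can be re-provisioned quickly.
baseline = env.snapshot(label="clean-baseline")

# Later, capture the exact state in which a bug appeared...
bug_state = env.snapshot(label="repro-defect-42")

# ...and restore either snapshot to isolate or resolve the problem.
client.restore(snapshot=bug_state)
```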

“Essentially, what makes us different from other players out there is we provide absolutely industry-standard infrastructure you can run on. It’s almost like an extension of your existing IT environment,” says Ian Knox, Skytap’s director of product management.

This publication recently ran a feature on virtual lab automation, where you can get a more complete sense of what Skytap is offering and why enterprises are likely to eat it up.

3tera

When you’ve invented a grid operating system designed for scaling Web applications on demand, you’re in a pretty good position to make cloud computing work for enterprise IT. 3tera, with its AppLogic offering, enables companies to create virtualized infrastructure for running and scaling applications on a pay-to-use basis.

“Most people are looking at cloud computing mainly as a service,” says Bert Armijo, 3tera’s senior vice president of sales, marketing and product development. “But we approach cloud computing not just as a service but as a platform that enterprises can implement in their own datacenter.” 3tera sells AppLogic to hosting partners and organizations alike to turn their datacenters into service-oriented utilities for transactional and streaming applications on grids of commodity servers. In other words, 3tera helps customers create scalable virtual datacenters.

“One way to think of AppLogic is as a turnkey solution that installs on racks of servers. You get not just virtual machines, but the ability to define the infrastructure that you would’ve had to build in the datacenter — including load balancing, firewalls, all the things you need to run your applications — but without having to put in a bunch of boxes, and all your provisioning [is] managed from a Web interface. You can bring over any application with no code modification,” Armijo says.

One of AppLogic’s magic tricks is enabling users to move their complete virtual infrastructures easily from one location to another. “You can package and move a large distributed system from London to New York with a single command, and as easily as if it were a Word doc,” Armijo says.

In AppLogic, applications are assembled using self-contained software components called virtual appliances. In essence, these virtual appliances form a disposable infrastructure that the application uses while operating. When the application runs, the virtual infrastructure it needs is created dynamically on the grid, maintained, and then shut down when no longer needed.
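As a rough illustration of assembling an application from self-contained virtual appliances and treating the resulting infrastructure as disposable, consider the Python sketch below. The appliance names and the `grid` interface are invented for this example; they are not 3tera’s actual AppLogic tooling.

```python
# Illustrative sketch only: the application definition and the "grid" object
# mirror the assemble/run/tear-down lifecycle described above; they are not
# 3tera's actual AppLogic interface.

# An application is described as a set of self-contained virtual appliances
# plus the connections between them.
APP_DEFINITION = {
    "name": "web-store",
    "appliances": [
        {"type": "firewall",      "instances": 1},
        {"type": "load_balancer", "instances": 1},
        {"type": "web_server",    "instances": 4},
        {"type": "database",      "instances": 2},
    ],
    "links": [
        ("firewall", "load_balancer"),
        ("load_balancer", "web_server"),
        ("web_server", "database"),
    ],
}

def run_on_grid(grid, definition):
    """Create the disposable infrastructure, run the app, then release it."""
    infra = grid.provision(definition)   # created dynamically on the grid
    try:
        infra.start()                    # the application runs on the appliances
        infra.wait_until_stopped()
    finally:
        grid.release(infra)              # shut down when no longer needed
```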

3tera partners with 24 datacenters on five continents to serve multinationals that need services worldwide. “That gives customers the opportunity to choose where to deploy their applications,” Armijo says. “We also have partners who can provide DoD-level security, if that’s what your application requires.”

Despite concerns about security, performance and control (“Our users are control freaks,” says Armijo), 3tera sees more enterprises showing interest in cloud computing. “The thing they’re looking for is enablement: speed, getting things to market quickly, leveraging inexpensive resources for testing, flexibility of not being locked in to expensive capital budgets, not having to hire 10 new people when they have a new application, and not having to go through a committee when they want to try something new,” Armijo explains.

Not that everything is cloud-bound, though. “Nobody’s going to move their system of record outside their own datacenter walls,” Armijo says. “But there’s lots of stuff, especially front-end stuff, that makes perfect sense to put in the cloud.”

Akamai

Most people think of Akamai as the company that speeds delivery of rich media over the Web, but it is so much more. The company’s EdgePlatform, which consists of more than 36,000 servers on almost 1,000 networks and handles billions of Web transactions every day, serves as the basis for a handful of Web optimization offerings.

In fact, with its Web Application Accelerator (WAP) and EdgeComputing solutions, Akamai is both an enabler and provider of cloud computing. “The way you can think of Akamai is we’ve now leveraged our distributed network, which now consists of 36,000-plus servers all over the world,” says Neil Cohen, director of product marketing. “What that really means to someone at the enterprise is that anywhere in the world you’re trying to access an application, over 60 percent of the world’s Internet users are within a network hop away of an Akamai server, which means we’re very, very close to any application user, and we’re very, very close to wherever you’re hosting your application.”

Cohen says Akamai has “numerous” software-as-a-service customers using the WAP service to optimize the cloud for SaaS because “it brings performance, availability, scale [and] security to cloud-based computing.” The company has hundreds of customers for WAP, and Cohen says it has seen a 10x increase in SaaS traffic. Fujitsu actually leveraged the product to optimize its infrastructure-as-a-service offering. “All these major IT trends are driving enterprise apps to the Internet,” he says, “and Web App Accelerator positions very well to all those different IT initiatives.” With WAP, Cohen adds, companies not ready to move business-critical applications to the cloud can optimize Internet delivery without giving up control.

More information about WAP, as well as its do-it-yourself Configuration Manager tool, can be found here.

On top of cloud enablement and cloud services (with its NetStorage offering), Akamai also does straight-up cloud computing. EdgeComputing allows customers to host their J2EE Web applications on the EdgePlatform, and, according to Cohen, “not only can a customer take their applications and run it on the Akamai cloud infrastructure, but they can pick where they want to run it close to the end-user to avoid a lot of the latency and inefficiencies associated with running that application far away from the user.” The company has a very broad class of customers for EdgeComputing, and what they really care about are performance and user experience, he says.

“We are powering a better Internet for 10 years,” explains Cohen. “We bring security, we bring performance, we bring reliability to the Internet, and we essentially bring our services to targeted enterprise customers … some of the biggest names, Fortune 500-type companies use Akamai, or 20 percent. These services are the way you can bring cloud-based services to the enterprise, whether it’s SaaS or something else.”

VMware

Cloud computing as we know it would not exist without virtualization. (Or, as John M. Willis puts it more delectably, “As flour is to a cookie, virtualization is to a cloud.”) Thus, any inquiry into enterprise clouds requires asking what VMware is up to — VMware being to enterprise server virtualization what Nabisco is to cookies. 

As it turns out, VMware is up to quite a lot. VMware’s vCloud initiative takes an inclusive, open approach that aims to help organizations build internal clouds that can also connect to external clouds when there is a demand for more capacity. The core is an assemblage of VMware’s infrastructure virtualization technologies and products, the Virtual Datacenter Operating System (VDC-OS). VDC-OS does three very big things: pooling servers, storage and networking resources as an on-premise cloud (Infrastructure vServices); managing applications to ensure they get the availability, scalability and security they need (Application vServices); and handling interoperability and capacity between private and off-premise clouds (Cloud vServices) in order to meet service level agreements or other requirements. Cloud vServices include the APIs that will allow this interoperability. Another service, vCenter Chargeback, will enable a datacenter to operate on a utility basis.
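The hybrid model sketched here, in which an internal cloud spills over to an external one only when local capacity runs out, reduces to a simple placement decision. The Python sketch below is a generic illustration of that logic under assumed interfaces; it is not VMware’s vCloud API, and the pool objects and method names are hypothetical.

```python
# Generic illustration of a "burst to an external cloud when the internal
# pool is full" placement decision. This is not the vCloud API; the pool
# objects and their methods are hypothetical placeholders.

def place_workload(workload, internal_pool, external_cloud, max_latency_ms=50):
    """Prefer on-premise capacity; burst off-premise only when necessary."""
    if internal_pool.available_capacity() >= workload.required_capacity:
        return internal_pool.deploy(workload)

    # The internal cloud is full: burst only if the external provider can
    # still meet the workload's service-level requirements.
    if external_cloud.estimated_latency_ms() <= max_latency_ms:
        return external_cloud.deploy(workload)

    raise RuntimeError("No placement satisfies capacity and SLA constraints")
```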

VMware offers the vCloud program in three tiers. VMware Ready services are clouds already available from hosting and managed service providers. VMware Ready Optimized services, still in development, will use the Cloud vServices API and other technologies to run applications both in on-premise datacenters and in off-premise clouds. VMware Ready Integrated, also still in development, will provide ways to manage across internal and external clouds.

As part of its plan to dominate the cloud computing space, VMware is teaming up with other companies to form its “vCloud ecosystem.” Partners include Rackspace, Verizon and British Telecom, which already uses VMware’s ESX as part of its virtual datacenter offering.  Aside from traditional service providers that want to offer cloud services, the ecosystem also includes new cloud providers and software developers. VMware already offers a collection of ready-to-run cloud apps through its Virtual Appliance Marketplace.

“Our product line, in particular virtual infrastructure solutions, has been enabling the pooling of physical resources and rapid provisioning of those resources to businesses units since 2003,” says William Shelton, director of Cloud Computing and Virtual Appliances at VMware. “There are cloud services built on this foundation both behind the firewall in enterprises and in the service provider community, where we have over 200 service providers in our vCloud service provider ecosystem. We build upon a proven, highly robust infrastructure of virtualization that is widely deployed.”

If you’re questioning whether VMware really is the Nabisco of virtualization, whether it really has the footprint and the influence to pull this off and bring users into the cloud, consider the following statistic: the company boasts more than 120,000 customers, including the entire Fortune 100 and 97 percent of the Fortune 500.

Amazon Web Services

Unquestionably the poster child for cloud computing, Amazon Web Services’ EC2 hasn’t always been a darling of enterprise customers. Nearly bare-metal virtual machines might constitute a playground for experimental developers and start-up software companies, but limited software choices, minimal support and no SLA left enterprise critics wanting more.

Things seem to be changing. When Amazon took EC2 out of beta last month, it brought with it a 99.95 percent SLA and beta support for Windows. While neither is a uniquely enterprise-centric feature, the number of enterprises running Windows and the number of developers working in .NET should not be underestimated.

And how important is that SLA? “Customers want to judge us on our performance, and overall they’ve been very pleased with our performance; we’ve had high uptime overall,” says Adam Selipsky, vice president of product management and developer relations for AWS. “Nonetheless, there are some companies, and I think this is particularly true for larger enterprises, who do want to see an SLA in place. Not, frankly, because an SLA ever compensates them for business loss, but just so they know their provider is serious and has skin in the game and will feel pain when they feel pain.”

Amazon has been rolling out enterprise-friendly features regularly throughout the past year, though. Elastic Block Storage (EBS) is like a SAN in the cloud, says Selipsky, meaning data persists regardless of whether the associated EC2 instances are running. It also keeps the storage near the processing, which Amazon hopes will encourage companies to move database-driven and data-intensive applications into the cloud.
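For a sense of what that persistence looks like in code, the snippet below uses the boto3 Python SDK (a later AWS SDK, shown purely for illustration; the instance ID and region are placeholders) to create an EBS volume and attach it to a running instance, so the data outlives any particular instance.

```python
# Creating and attaching an EBS volume with the boto3 SDK (a later AWS SDK,
# used here only for illustration; IDs and region are placeholders).
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Create a 100 GiB volume in the same Availability Zone as the instance.
volume = ec2.create_volume(Size=100, AvailabilityZone="us-east-1a")
ec2.get_waiter("volume_available").wait(VolumeIds=[volume["VolumeId"]])

# Attach it to a running instance; the data persists independently of the
# instance's lifecycle and can be re-attached elsewhere later.
ec2.attach_volume(
    VolumeId=volume["VolumeId"],
    InstanceId="i-0123456789abcdef0",
    Device="/dev/sdf",
)
```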

Availability Zones provide resilience. This feature lets applications run concurrently in multiple geographic locations, the premise being that — short of an act of God — a failure-causing event in Zone A will have no effect on Zone B.
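A minimal sketch of that idea, again with the later boto3 SDK and placeholder AMI and zone names: launching identical instances in two Availability Zones so that a failure in one zone leaves the other copy running.

```python
# Launching identical instances in two Availability Zones for resilience,
# using the boto3 SDK for illustration; the AMI ID is a placeholder.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

for zone in ["us-east-1a", "us-east-1b"]:
    ec2.run_instances(
        ImageId="ami-0123456789abcdef0",
        InstanceType="t3.micro",
        MinCount=1,
        MaxCount=1,
        Placement={"AvailabilityZone": zone},
    )
```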

Enterprise software support, too, has matured significantly. Aside from Windows, EC2 offers multiple flavors of free and paid Linux, including Red Hat Enterprise Linux, and OpenSolaris. For databases, EC2 offers both MySQL and Oracle instances. Users interested in extreme transaction processing can leverage GigaSpaces’ eXtreme Application Platform images. These are on top of the myriad cloud database and instant-scalability solutions offered up by Amazon’s growing ecosystem of Web partners.

One of the hallmarks of cloud platforms is pay-per-use credit card billing, but Selipsky says Amazon, which strives to be the “most customer-centric company on Earth for developers,” also is willing to consider any new pricing or contractual models that enterprise customers might request.

Enterprises already are using EC2 for certain tasks. The New York Times famously uses EC2 for a variety of digital publishing initiatives, and, says Selipsky, Eli Lilly is conducting pharmaceutical research on EC2. ESPN.com runs a social networking site using EC2 in conjunction with Amazon’s Simple Storage Service (S3).

Selipsky thinks the always-growing feature list, as well as the feedback from Amazon’s 440,000 paying customers, gives the company a unique combination of value proposition and anecdotal evidence of reliability. Amazon’s status as arguably the first mainstream public cloud offering also plays a role in convincing companies to join the EC2 ranks. “I don’t think there have been many other large companies with an end-to-end suite of in-the-cloud IT infrastructure services that have been out in the market at all,” says Selipsky, “never mind with a fully paid offering for over two years.”
