The Virtual Evolution of Managed Services

By Derrick Harris, Editor

March 24, 2008

There is no doubt that managed hosting has been undergoing an on-demand transformation thanks to advances in virtualization and grid technologies, but although the underlying technologies might be similar, the available services are increasingly unique — targeted at doing a few things and doing them well. This is especially true for two relatively recent entrants into the virtualized hosting fray.

Bringing Virtualization to Canada’s SMBs

Radiant Communications, a Vancouver, British Columbia-based provider of commercial broadband solutions, publicly entered the virtual space in October 2007 with its AlwaysThere hosted Exchange offering. Leveraging Radiant’s Grid Computing Utility (GCU), a collection of virtual machines combined with a flexible storage area network (SAN), the hosted Exchange offering gives users a dedicated instance of Microsoft Exchange with, according to Radiant’s director of advanced hosting, Jason Leeson, all the security and flexibility of an on-premise offering, as well as the economies of scale that come along with a shared environment. Thus far, the company has been focusing its marketing efforts around the Exchange service (specifically within the Canadian SMB market), and has been gaining a lot of traction as a result, but Leeson says that this is just the tip of the iceberg.

Radiant also is offering virtual servers (running Windows Server 2003, Red Hat Linux or any VMware-compatible OS) to customers who want the ability to scale their resources as needed, on demand, without having to invest in purchasing and managing physical machines. Leeson said the concept of renting virtual servers on a pay-per-use basis is currently showing the most opportunity for uses such as disaster recovery (i.e., backup) and business continuity (i.e., automatic failover), but some customers are actually hosting their own applications on the grid. Radiant’s grid computing environment is connected directly to its existing MPLS (Multi-Protocol Label Switching) core network, which Leeson says allows Radiant to host and deliver grid-based virtual servers and applications for each customer in a completely secure and private manner.

Concerning the latter, Leeson acknowledges that Radiant’s virtual server model is still in its infancy, but points to significant advancements on the horizon. Currently, users wishing to cluster several VMs must wire the servers together themselves, as Radiant has not yet incorporated that level of automation. Additionally, customers must add servers directly through Radiant, with servers usually provisioned and ready to go in a few hours. However, Leeson explained, Radiant is only in the first stage of a three-phase rollout: (1) standardize the grid infrastructure; (2) build the internal management tools for automation and provisioning; and (3) delegate administration, control and management to users. Once the final phase is complete — or at least underway — Leeson says customers will have the full virtual private datacenter (VPDC) experience of being able to turn up or turn down servers on demand, track server utilization, etc. He thinks these customer-side management tools and consoles will be big differentiators as Radiant continues to grow its service.

Right now, the main target for virtual servers and VPDCs is the independent IT consultant market. “We talk to a lot of IT consultants, for example, who don’t have their own datacenters [but] have niche vertical apps that they offer to their customer base,” said Leeson. “This is an opportunity for them to tap into the Radiant datacenter, and we set them up with the virtual servers and they can run what they want.” Professional service organizations, such as law firms, have been the main customers for the AlwaysThere hosted Exchange offering, added Leeson.

Hosting in the Cloud

Attacking the management problem from a different angle is Mosso, a Rackspace company that started in 2006 when co-founders Jonathan Bryce and Todd Morey had the idea to offer Rackspace’s enterprise-level technology to smaller users in a multi-tenant environment. In February of this year, Mosso introduced a revamped service — the Hosting Cloud — in an attempt to make the Web hosting experience as simple as possible without sacrificing reliability.

Leveraging a cloud of computers and VMs to offer customers as-needed scalability, Bryce describes the Hosting Cloud as “a place where developers can basically upload their code and we take care of the rest.” With this in mind, the infrastructure uses standard Web technologies like PHP, Ruby, Perl, .NET, and ASP, and users don’t do any server provisioning, as Mosso’s internally developed software manages provisioning, scaling and other aspects of the environment automatically. Once the application has been uploaded via the Web interface, $100 per month gives customers access to 500GB of bandwidth, 50GB of high-performance storage and 3 million Web requests. Scaling is done automatically as applications experience greater traffic or require more resources, and the extra resources and/or Web requests cost only “pennies”: $.50 per gigabyte of disk space; $.25 per gigabyte of bandwidth; and $.03 per 1,000 Web requests. “A lot of those other systems,” said Bryce, “… make it easy to provision additional resources quickly, but they don’t necessarily do it automatically.”
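The published rates make a monthly bill easy to estimate. A minimal sketch, using the figures quoted above (the function name and structure are illustrative, not Mosso's actual billing code):

```python
# Estimate a monthly Hosting Cloud bill from the rates quoted in the
# article. Constants come from the text; the helper is illustrative.

BASE_FEE = 100.00            # USD per month
BASE_BANDWIDTH_GB = 500      # included bandwidth
BASE_STORAGE_GB = 50         # included high-performance storage
BASE_REQUESTS = 3_000_000    # included Web requests

OVERAGE_STORAGE_PER_GB = 0.50     # USD per extra GB of disk
OVERAGE_BANDWIDTH_PER_GB = 0.25   # USD per extra GB of bandwidth
OVERAGE_PER_1K_REQUESTS = 0.03    # USD per extra 1,000 Web requests

def monthly_bill(bandwidth_gb, storage_gb, requests):
    """Base fee plus overage charges for usage beyond the included quotas."""
    bill = BASE_FEE
    bill += max(0, storage_gb - BASE_STORAGE_GB) * OVERAGE_STORAGE_PER_GB
    bill += max(0, bandwidth_gb - BASE_BANDWIDTH_GB) * OVERAGE_BANDWIDTH_PER_GB
    bill += max(0, requests - BASE_REQUESTS) / 1000 * OVERAGE_PER_1K_REQUESTS
    return round(bill, 2)

# A site within every quota pays only the base fee:
print(monthly_bill(bandwidth_gb=400, storage_gb=40, requests=2_500_000))  # 100.0
# 600 GB of bandwidth and 4M requests adds $25 + $30 in overages:
print(monthly_bill(bandwidth_gb=600, storage_gb=50, requests=4_000_000))  # 155.0
```

At these rates, even a site doubling its included request quota adds only $30 to the monthly bill, which is the "pennies" point Bryce is making.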

As good as this all sounds, though, even Bryce acknowledges that the platform has limitations — some of which, like not allowing users administrative access, are by design. “It’s a plus because it means they don’t have responsibility for that, but it’s a minus because it means there are limitations that we put in place — you couldn’t run SAP or something on our cluster,” explained Bryce. “It’s meant to do a few things really well. It’s meant to serve Web applications and their … databases.” Essentially, if users have needs that are out of the norm for Web applications (e.g., connecting to a legacy system with custom C code), Mosso will not currently handle them within its system, as such custom installations might affect downtime or otherwise throw a wrench in the system.

The reality, says Bryce, is that there always are trade-offs when dealing with a fully managed platform. “One of the questions we get is why would someone go with Rackspace over Mosso, and that’s generally what it is,” he elaborated. “Rackspace’s customers have more complex and more customization needed to work with their overall architecture, and we generally do really well for the set of standard technologies that we support.”

Among the technologies that Mosso does not support is Java, although Bryce hopes that will change by the end of the year. Java support will come hand in hand with a “sandbox” environment Mosso is currently working on, which would allow customers administrative access of individual virtual instances without Mosso having to install any unique software across its entire pool of resources. Customers also will have more in-depth insight into how their applications are running, thanks to an improved control panel that will, among other things, allow users to access storage snapshots. In addition to these upgrades, Mosso also plans to expand into an additional Rackspace datacenter and to introduce larger base packages for customers who know they will regularly go beyond the current base quantities.

Mosso is an optimistic company, though, and Bryce firmly believes that Mosso’s pros outweigh its seemingly minimal cons. And one big thing the company has going for it is its level of service, which Bryce describes as deeper and more proactive than those of many other managed service providers. Support is available 24 hours a day via phone, e-mail or chat, says Bryce, and because everyone is in-house, customer service representatives have easy access to the technology team should they need it. “Even though everything we’re doing is high technology, we still keep a people element in it,” says Bryce. “That’s been one of the keys to Rackspace’s success over the years, and we definitely stick with that legacy.”

One example of this personal service to which Bryce points involves a recent appearance by a Mosso customer on a national network morning talk show. The customer gave Mosso a heads-up as to when it would be on, and Mosso scaled up its infrastructure in advance to avoid any potential traffic-related downtime. Generally, however, customers don’t know when the need for increased scale will arise, but that doesn’t mean there is any less service involved. According to Bryce, Mosso looks at every deviation from users’ averages and acts accordingly. If it’s just more Web traffic, then the answer is more resources. If, however, it’s a funky SQL query, the Mosso reps will play the role of database administrators and help get everything running smoothly.

Clearly, customers aren’t shying away from Mosso, as the company currently boasts more than 2,000 customers, with the majority having joined in the past year. Bryce says Mosso adds 900 applications per week to its cloud, and is hosting more than 76,000 mailboxes.

One of these customers is David Ponce, owner and managing editor of consumer technology blog Oh Gizmo, who has been with Mosso for almost a year. He was turned on to Mosso after seeking input via a post on his site, as increasing traffic and “terrible” experiences with other providers left him needing to make a change.

At one point, Ponce downgraded from one provider’s “limited” virtual private server offering to the regular grid-based offering, and while it handled his needs just fine, customer service was another story altogether. Corroborating Bryce’s account of Mosso’s level of customer service, Ponce says he has “never seen anything like it.” “I can get in touch with a human within two minutes,” he added, “and, to me, that is worth every penny.”

As for the product itself, Ponce points to some early issues with downtime following appearances on the front pages of Digg or Slashdot, but says everything has been running pretty much “perfectly” — nearly 99 percent uptime — after a couple months of tweaking the settings. “Whatever they’re doing, their clusters are working, because whenever there’s a spike, it grows and handles it just fine,” said Ponce.

Ponce also has some concerns with the pricing around Web requests, as he has been exceeding the limits lately and expects to do so occasionally in the months to come. He welcomes the opportunity to upgrade to a package offering more base requests, but notes that for the most part, Mosso Hosting Cloud is perfect for his needs, which generally involve 600,000 to 700,000 page views per month.
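How 600,000 to 700,000 page views can blow past a 3-million-request allotment comes down to the difference between a page view and a Web request: each page view typically triggers several HTTP requests for images, stylesheets and scripts. A back-of-the-envelope check, where the requests-per-page figure is my own assumption (it varies widely by site design):

```python
# Rough check of how page views translate to billed Web requests.
# requests_per_view is an assumed average, not a figure from the article.

BASE_REQUESTS = 3_000_000    # requests included in the base package
page_views = 700_000         # upper end of Ponce's monthly traffic
requests_per_view = 5        # assumed: HTML plus a few assets per page

total_requests = page_views * requests_per_view
overage = max(0, total_requests - BASE_REQUESTS)

print(total_requests)            # 3500000 -- already past the base quota
print(overage / 1000 * 0.03)     # 15.0 USD in request overage charges
```

At the quoted $.03 per 1,000 requests, the overage is small in dollar terms, which is why a larger base package appeals more as a convenience than as a cost-saver.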

One Step at a Time

Regardless of how many customers managed hosting providers draw or how grand their master plans are, both Radiant and Mosso understand that success in the utility hosting space requires a measured approach. According to Mosso’s Bryce, truly pervasive cloud computing will only occur if today’s providers keep their focus narrow, doing one thing well and managing that service “all the way up and down the stack.” Like Amazon’s S3 for storage and Mosso’s Hosting Cloud for Web applications, Bryce foresees a day when “[t]here’ll be enough of these services and enough of these technology-specific utilities that are high-quality and high-performance that most things will be running on them.”

Radiant’s Leeson sees the market unfolding in much the same way, noting that Radiant saw e-mail as a great opportunity to introduce customers to its grid-based service because e-mail is a mission-critical application that doesn’t provide any real strategic advantage to organizations. Disaster recovery and business continuity are other areas where customers, particularly SMBs, can move some tasks into the cloud without over-committing. “When we talk about SaaS or cloud computing, it’s not all or nothing,” he explained. “It’s not companies that are suddenly going to move everything into the cloud and get rid of all their on-premise stuff. It’s going to be slow, and it’s going to be gradual, and they’re going to start with certain things.”
