Defending the Notion of a Utility Future
We’ve been talking about “The Big Switch” for a few weeks now, so we figured it’s about time we interviewed its author, Nicholas Carr, who took the time last week to discuss how some of the predictions he makes in the book will play out in the world of enterprise IT.
I’m not going to spend a lot of time here summarizing the interview and pointing out its highlights (because I assume if you’re reading this, you’re going to read the article as well), but I do want to add some perspective to Carr’s prediction that enterprise datacenters will eventually go the way of the dodo. Many anonymous commenters on other Web sites have berated Carr, questioning his credentials to speak at all about the future of computing and claiming that datacenters will never die because companies have spent far too much money on these infrastructures to simply toss them aside and move to a utility model. However, what these folks might be missing — and what Carr makes clear in this interview — is that he is not predicting the demise of the datacenter any time soon, but rather over the course of a few decades. And even Carr acknowledges that not everything will be done over the Web; big companies, in particular, will likely utilize a mix of external and local computing resources.
Now, I’m no IT historian, but it seems unlikely to me that CIOs over the past couple of decades foresaw the advent of the current virtualization age yet continued to stockpile servers knowing full well they would be scrambling to consolidate at some point in the not-too-distant future. Today, however, whether the reasons are simplicity, higher utilization or just using less electricity, companies are scrambling to virtualize their datacenters and pare down the IT infrastructures in which they invested so heavily just a few years earlier. Of course, virtualization brings with it a lot more capabilities than just consolidating multiple applications onto one server, and these other capabilities, such as on-demand access and priority-based management, go a long way (as Carr points out) toward getting users comfortable with the benefits of utility computing.
The increasing number of software-as-a-service applications also goes a long way toward making Carr’s predictions look pretty accurate. Ranging from Salesforce.com’s CRM software to Callidus’ on-demand sales performance management software to Demandware’s e-commerce platform, customers of all shapes and sizes are already taking advantage of on-demand applications. Plus, SMBs are increasingly moving to storage-as-a-service and computing-as-a-service environments to handle their computing needs, and providers of these services say even large companies are moving less-critical aspects of their operations to these utility platforms. Assuming these customers have positive experiences with the utility model, and assuming the providers continue to harden, expand, optimize and secure their services, is it so crazy to think companies will continue to move portions of their operations onto such platforms, eventually making a near-complete migration to utility computing?
A prime example of a large company already considering such a move is Sun, which has garnered much attention — some positive, some negative — over a series of blog entries in which one of its datacenter architects said the company hopes to eliminate its internal datacenters by 2015. Feasibility aside, Sun’s plans certainly prove that the notion of utility computing isn’t completely unpalatable to large corporations. Granted, Sun will still offer its Network.com utility service, via which Sun could, in theory, acquire its computing resources and therefore still end up (technically) managing its own datacenter, but that’s a mindbender for another day. (Sun is planning to expand Network.com beyond HPC offerings, however, so the idea might not be completely unreasonable.)
Speaking of IT utilities, EMC last week announced its new Fortress storage-as-a-service offering aimed at SMBs, and Infosolve added an external data enhancement service to its line of Network.com-powered data quality solutions. The aforementioned Demandware also added capabilities to its on-demand e-commerce platform. I’m not ready to anoint Carr the IT Nostradamus just yet, but the writing on the wall seems to be getting more legible.
Elsewhere in the issue, be sure to check out the following announcements: “IBM, Researchers Expand Grid-Based Cancer Project”; “Cisco Releases Highest-Performance Security Appliances”; “Microsoft Announces Virtualization Adoption Strategy”; “Brocade Adaptive Networking to Simplify Virtualization”; “Gartner Survey: 85 Percent of CIOs Expect ‘Significant Change’”; and “OGF Announces OGF22 Program.”
Finally, we’ve been seeing a lot of news around extreme transaction processing (XTP) lately, and next week we’ll have two articles that lay out what various vendors are doing in the space, as well as how Gartner has warmed to this market and these types of virtualized solutions over the past year. Be sure to check back then.
Comments about GRIDtoday are welcomed and encouraged. Write to me, Derrick Harris, at email@example.com.