Recently, an IDC interview series (see IDC's report on The Maturing Cloud) reported that respondents identified "pay per use" as the number one benefit of cloud computing, with an impressive score of 77.9 on a scale from 0 to 100. This suggests that many believe clouds generally offer high cost savings for all cloud users.
Immediately, the question comes to mind: does this benefit hold true for every company, or will only a relatively small percentage of companies enjoy large cost savings? The following simple number-crunching estimates shed some light on this matter.
To make it a bit more interesting, we look at SaaS (Software as a Service) and compare a classical in-house computing environment with a comparable external "public cloud" environment.
Assume, for simplicity, that for two years you buy or lease 100 CPUs ($250K), software licenses ($250K), space, electricity, and so on ($100K), and you employ two system administrators ($200K). That comes to $800K for two years, or $4,000 per CPU (all inclusive) per year.
On the other hand, cloud service providers usually have special deals with hardware and software vendors (say 50 percent). They also save on cost for space and electricity (say 50 percent), and they certainly need fewer sysadmins (say 50 percent), all together resulting in $2,000 per CPU per year (all inclusive). If the cloud service provider adds a profit, say 100 percent, the total price per CPU per year for the cloud customer would be $4,000 — the same amount as for the in-house solution, thus resulting in no savings for the cloud customer. The end result in this case? Pay-per-use is useless.
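This back-of-envelope calculation can be reproduced in a few lines of Python; a minimal sketch, using only the assumed figures from the text (not real market prices):

```python
# In-house cost over two years for 100 CPUs, in $K (assumed figures from the text)
hardware, licenses, space_power, sysadmins = 250, 250, 100, 200
total_two_years_k = hardware + licenses + space_power + sysadmins  # 800 ($K)

cpus, years = 100, 2
in_house_per_cpu_year = total_two_years_k * 1000 / cpus / years  # $4,000

# The provider pays roughly half on every item (assumed 50% discounts)...
provider_per_cpu_year = in_house_per_cpu_year * 0.5              # $2,000
# ...and adds, say, 100 percent profit on top.
cloud_price_per_cpu_year = provider_per_cpu_year * 2             # $4,000

# At equal (full) utilization, the cloud price matches the in-house cost:
print(in_house_per_cpu_year, cloud_price_per_cpu_year)  # 4000.0 4000.0
```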
This calculation assumes the same high utilization rate for the in-house as well as for the public cloud solution. But for most companies this is not realistic. Rather, IDC and others found out a long time ago that the average utilization of enterprise servers is more in the order of 10-20 percent. This percentage can be verified simply by assuming that employees work just 10 months per year, 5 days a week, and 8 hours a day: (10 × 20 × 8) / (12 × 30 × 24) ≈ 18 percent. The in-house cost is then $4,000 / (10 months × 20 days × 8 hours) = $2.50 per CPU per hour of actual usage.
On the other hand, the cost for the same CPU for a cloud service provider would be $2,000 / (12 months × 30 days × 24 hours) ≈ $0.23, an impressive reduction by a factor of roughly 11 against the in-house solution. Even if the cloud service provider adds a 100 percent profit on top ($0.46 per hour), the customer would still save a stunning $2.04 on each CPU hour (full cost), or $3,264 per year for each CPU.
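The utilization argument can be checked the same way; again a sketch built only on the assumptions stated above:

```python
# Hours a CPU is actually used per year: 10 months x 20 workdays x 8 hours (assumed)
used_hours = 10 * 20 * 8          # 1,600 hours
# Hours in a calendar year, approximated as in the text: 12 x 30 x 24
calendar_hours = 12 * 30 * 24     # 8,640 hours

utilization = used_hours / calendar_hours      # ~0.185, i.e. ~18 percent
in_house_per_hour = 4000 / used_hours          # $2.50 per used CPU hour

# The provider runs near full utilization at $2,000 per CPU-year...
cloud_cost_per_hour = 2000 / calendar_hours    # ~$0.23 (factor ~11 cheaper)
# ...and charges double after a 100 percent markup.
cloud_price_per_hour = cloud_cost_per_hour * 2 # ~$0.46

saving_per_hour = round(in_house_per_hour - cloud_price_per_hour, 2)  # ~$2.04
saving_per_year = saving_per_hour * used_hours                        # ~$3,264

print(round(utilization, 3), saving_per_hour, round(saving_per_year))
```

Note that the $3,264 figure follows the article's convention of rounding the hourly saving to $2.04 before multiplying by the 1,600 hours of use.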
The above estimates present a spectrum of monetary gains for cloud customers: at one end, a saving of $2.04 per CPU hour for a cloud customer with an average utilization of 18 percent; at the other, $0.00 for those whose in-house solution achieves the same high utilization rate (close to 100 percent) as a cloud service provider. Obviously, the monetary benefit diminishes as utilization rises. Indeed, high utilization rates are generally achieved in industries with heavy computer simulations in their R&D departments — the automotive and aerospace industries, for example. For those industries, even an internal "private cloud" may be of questionable economic benefit.
Remark: the above Gedankenexperiment focuses solely on the monetary benefit of cloud computing. I intentionally did not touch on other benefits (and roadblocks), which depend heavily on the individual requirements of a company.