CTO Panel: Are Public Clouds Ripe for Mission Critical Applications?

By Nicole Hemsoth

February 15, 2011

This week we gathered the opinions of five technical leaders at cloud service companies to gauge their views on how customers are receiving the idea of placing mission-critical applications on public cloud resources. Chief Technical Officers from smaller public cloud-focused companies, including Stelligent, HyperStratus, Appirio, Arcus Global, and Nube Technologies, weighed in on their sense of customer acceptance of putting core applications in the cloud.

Just as important as the initial question about viability is a secondary query: for those who did decide to send mission-critical apps to the public cloud, what was the driving factor?

A number of surveys have been conducted over the course of the past year to gauge general sentiment about placing business-critical or mission-critical applications in the cloud, and more specifically on a public cloud resource such as that offered by Amazon Web Services.

Although survey data varies according to the respondent base, the consensus seems to be that there is still quite a bit of hesitancy to place mission-critical applications in an environment where there is not a complete sense of control—not to mention concerns about data protection and location, compliance and regulatory risks, fear of lock-in…the list tends to go on.

One recent survey conducted by ESG Research found that of the 600 American and European IT professionals questioned, 42% said that public clouds would not enter into their business models in the next five years. Among the top reasons listed were, perhaps not surprisingly, data and privacy concerns (43%), loss of control (32%), existing investments in current infrastructure (also at 32%), and the need to feel that the cloud ecosystem is mature before diving in (29%), while 28% responded that they were simply satisfied with their current infrastructure.

While conversations with enterprise IT leaders often follow this same trajectory, the time seemed ripe to check in with technical leaders at a number of cloud services companies to see whether their sense of customer concerns about placing mission-critical applications in the cloud matched the hesitancy reflected in the survey data.

In addition to gauging their sense of the climate for mission-critical applications running on public cloud resources, we also asked a secondary question: is it a “tough sell” for customers to put business-critical applications on such resources, and when it is not, what is the motivating factor?

To provide some depth on the viability of mission-critical applications for public clouds (and what eventually tips the scale for some companies to make that decision), we gathered opinions from Lars Malmqvist, CTO and Director of Arcus Global Ltd.; Sonal Goyal, CTO/CEO of Nube Technologies; Paul Duvall, CTO at Stelligent; Glenn Weinstein, CTO at Appirio; and Bernard Golden of HyperStratus.

We’ll start with sentiments from a company that has experience dealing with public sector clients, Arcus Global Ltd.

Lars Malmqvist serves as Director and CTO at Arcus Global Ltd., a company that deals specifically with the needs of public sector clients in the UK. The company supports pilots, migration, development and planning for cloud computing projects for large government organizations. This public sector focus made the company a natural choice for the question of whether the concerns outweigh the benefits for core applications on public cloud resources, since governments everywhere are approaching the concept of clouds with caution.

Lars provided a unique perspective as well because in his experience, it is difficult to keep pace with the demand to put mission-critical applications in the cloud. Lars writes…

“At Arcus we work exclusively with public sector clients. If you’ve been anywhere near government ICT recently, you’ll know that cloud comes up in just about any conversation you have. Different groups respond differently to it: the managers love it for the cost savings, technical people tend to find it interesting and a bit threatening, while the security people really don’t seem to like it much at all.

That being said, on a day-to-day level, far from being a tough sell, the appetite our clients have for putting systems and applications on a public cloud infrastructure far outstrips our ability to actually deliver it in practice. We literally have organisations that would be willing to move their entire core infrastructure to the public cloud tomorrow if we could solve the technical, legal, and security challenges.

The constraints are well known and mainly revolve around security and compliance. Simply put, for some categories of data we simply don’t know what the compliance requirements are for putting data on the public cloud. Best practice and guidance have yet to mature, and laws always lag behind technology.

In the UK, the biggest challenge at the moment is around IL3 (Impact Level 3) data, which, to put it crudely, is data that is sensitive enough to really mess someone’s life up or to cause significant disruption to public services.

The existing security guidance simply doesn’t map onto a cloud infrastructure in a neat way. Therefore moving something like a system supporting adult social services to the public cloud would require the organization to make an independent risk assessment and be willing to stand by it in the face of external scrutiny. Few organizations in the public sector are quite that risk tolerant.

That being said, government bodies across the world are working on resolving such issues. The pressure is on to cut costs, and everyone in government ICT seems to be looking at the cloud to deliver them.

My expectation would be that within 12 to 18 months these issues will be resolved and clear guidance will be given by central government bodies and their regional equivalents on how to proceed even with highly sensitive data. When that is in place, I would expect a mass exodus of in-house systems, business-critical or not, from at least local and regional government.”

Sonal Goyal is CTO and CEO at Nube Technologies, a provider of cloud solutions for large-scale analytics and big data problems. Nube’s HIHO, a Hadoop connector for databases and data sources, is a framework that allows customers to move data to and from Hadoop clusters. The company is focused on data mining and analytics using Elastic MapReduce, Hive, Cassandra and related tools for handling both structured and unstructured data at scale. Of the viability of public cloud resources for complex mission-critical applications, Sonal Goyal writes:

“I strongly believe that public cloud usage will grow phenomenally for mission critical business applications and data. The two main concerns organizations have about moving critical pieces to the cloud are security and vendor lock-in.

Companies have been cautious about moving sensitive data to a public cloud for fear of compromised information security, with data being accessed through unauthorized channels. Data governance and the ability to audit and monitor data are also genuine concerns. Organizations have been worried that cloud providers like Amazon, GoGrid, Rackspace, Google and Microsoft, who share their infrastructure, offer little support in this direction. The second concern is vendor lock-in. Current cloud providers do not offer a unified approach that lets customers use their services seamlessly across any provider. Organizations would like to safeguard against this; they would like the flexibility of being able to move critical applications from one cloud to another.

I believe that these concerns, though valid, will slowly fade. These are the same concerns companies had over outsourcing, yet we now outsource payroll, legal and other key business processes, even medical transcription, across countries. Cloud providers are increasingly offering virtual private clouds and reserved infrastructure, so that organizations do not need to share if they don’t want to. Encryption, passwordless logins, firewalls and the like already offer some level of data security.

Recent technical innovations like SecureCloud from TrendMicro and CipherCloud are first steps in making things better. On the interoperability issue, NIST is already working on the SAJACC project and things should get addressed soon. On the API level, there are efforts like DeltaCloud to make things easier.

The needs of the business, the agility required by the market, the ever-exploding data and the need for more and more capacity will drive this change. As businesses grow, they will have to rely more and more on public clouds. Organizations cannot afford to make massive upfront investments in infrastructure and support personnel. The pay-as-you-go model offered by cloud providers will soon be the norm, forcing businesses to re-evaluate their key components and move them to the cloud. They will demand better security standards and uniformity from cloud providers, and they will get it.”
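To make Goyal’s point about existing safeguards concrete, consider a minimal sketch of our own (not Nube’s code) of using a provider-side firewall rule to restrict access to a cloud-hosted application before any sensitive data arrives. It assumes the Python boto library with AWS credentials already configured in the environment, and the address ranges are illustrative placeholders:

# Minimal sketch: lock down network access to an EC2-hosted application
# using a security group, the provider-side firewall Goyal alludes to.
# Assumes boto and AWS credentials in the environment; CIDR ranges are placeholders.
import boto

conn = boto.connect_ec2()

# Create a dedicated security group for the application tier.
sg = conn.create_security_group('app-tier', 'Locked-down application tier')

# Allow SSH only from a known office range, and HTTPS from anywhere.
sg.authorize(ip_protocol='tcp', from_port=22, to_port=22, cidr_ip='203.0.113.0/24')
sg.authorize(ip_protocol='tcp', from_port=443, to_port=443, cidr_ip='0.0.0.0/0')

print('Security group %s created with restricted ingress rules' % sg.name)

Combined with encrypting sensitive fields before upload, this kind of provider-level control covers a good portion of the baseline security Goyal describes as already available.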

Glenn Weinstein is CTO at Appirio, a cloud solutions company that delivers both projects and professional services to customers with mission-critical needs. Glenn writes:

“We are definitely seeing large enterprises moving widespread mission- and business-critical operations to the public cloud. It’s not necessarily a tough sell, particularly to CIOs who have already recognized the value of looking first to public cloud solutions for emerging business problems.

By moving applications to the public cloud, enterprises delegate significant portions of many non-business-specific concerns, including scalability, performance, security, deployment, failover, backup, load balancing and interoperability, to large firms specializing in technology.

This frees up IT resources to focus nearly all their time and energy on using that technology to solve business problems. In this way, public cloud computing finally offers a solution to the long-standing dilemma of IT spending upwards of 70 percent of its budget on routine maintenance and operations. Shifting to the public cloud allows CIOs to flip this ratio and spend 70 percent or more on business analysis and process improvement.

As public cloud leaders like Salesforce.com and Google Apps gain widespread acceptance and experience rapid customer growth, more technology professionals and CIOs are experiencing the benefits to IT first-hand, lending credibility to public cloud claims about speeding up development processes and lowering costs. With a taste of this success, they are anxious to push additional projects into the cloud, at the same time that the vendors are greatly expanding their platform-as-a-service (PaaS) offerings.  We expect this growth to accelerate as CIOs recognize not only the total cost benefits, but also the speed-to-market improvements.”

Paul Duvall is CTO at Stelligent, which provides “Continuous Delivery Services,” operating continuous delivery operations centers for large companies using cloud computing resources. The company has experience handling cloud implementations on Amazon’s servers and has worked with a number of customers to get their applications running in a public cloud environment. Paul writes:

“Our customers (health care, financial, real estate) typically employ a hybrid model, using the public cloud for their numerous non-production environments and a private cloud, or a traditionally hosted approach, for production systems. Given that this hybrid approach was barely a consideration for our customers just a few years ago, I see the trend toward moving systems to a public cloud continuing to gain speed.

Notably, we’ve found that automation is the key to getting the most out of moving to the cloud for mission-critical systems. For example, if you need to install database or application containers every time you stand up a new instance, you’re not getting the kinds of productivity gains that you’d achieve by automating the environment instantiation. Automating the provisioning and deployment provides the organization enormous flexibility to release their software wherever and whenever they choose.

Some customers are concerned about data security and whether a public cloud provider increases their vulnerability. If the customer’s concern is simply a lack of trust that the public cloud vendor will keep their data safe, we illustrate the various security processes and mechanisms applied by the public cloud vendor and suggest applying appropriate application security techniques, such as encryption, as they would normally do in any system that holds identifiable information.

The famous quote “trust, but verify” is quite applicable to the increasing trend of companies moving their mission-critical systems from internally hosted infrastructure to a public cloud. It’s shocking that some large organizations apply implicit trust to their own operations teams who manage their systems. Yet if many of these organizations performed an internal security audit, the more respected public cloud vendors would win “hands down” in terms of processes, security accreditation and the like every single time.”
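Duvall’s emphasis on automating environment instantiation is easy to illustrate. The sketch below is not Stelligent’s tooling; the AMI ID, key pair and package list are hypothetical placeholders, and it assumes the Python boto library with AWS credentials configured. It boots an EC2 instance whose user-data script installs the database and application container on first start, so a fresh environment comes up without hand configuration:

# Minimal sketch of automated environment instantiation on EC2 (assumes boto).
# The AMI ID, key pair and packages below are illustrative placeholders.
import boto

USER_DATA = """#!/bin/bash
# Runs automatically on first boot: install the database and app container
# so a fresh environment is ready to receive a deployment.
apt-get update -y
apt-get install -y postgresql tomcat6
"""

conn = boto.connect_ec2()
reservation = conn.run_instances(
    'ami-12345678',              # placeholder AMI ID
    instance_type='m1.small',
    key_name='deploy-key',       # hypothetical key pair
    user_data=USER_DATA,
)
print('Launched instance %s' % reservation.instances[0].id)

The same principle extends to full configuration-management templates; the gain Duvall describes comes from environments being reproducible from code rather than from a checklist.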

Bernard Golden leads HyperStratus, a company that helps organizations take advantage of cloud architectures by advising on the infrastructure, provider, application and other choices customers must make. Given his experience working with enterprise customers at every stage of the cloud deployment process, he has had time to form some firm opinions about the viability of public clouds for mission-critical applications. Golden states:

“Many organizations have reservations about putting critical business applications in the cloud. The primary concern raised about public cloud computing is security, although it often turns out that while the term is used, the actual concern centers on compliance or risk exposure. Our belief and experience is that public cloud computing is viable today for many mission-critical applications.

The primary motivation for application groups to embrace a public cloud alternative is dissatisfaction with the current internal data center offering, for reasons of lack of responsiveness or cost. One client of ours, a Fortune 500 company in the information services industry, considered the corporate data center, but decided to pursue a public cloud option because it would reduce their cost on the order of 75%. Nonetheless, convincing large organizations to use a public cloud infrastructure is often difficult and many are not yet ready to pursue such a choice.

Our expectation is that use of public cloud computing by large organizations will gradually increase as they become more familiar and comfortable with that decision. A galvanizing event for making such a decision is to see a peer organization succeed with a similar application that is being considered for placement in the public cloud.”

This concludes our round of gathered views on the subject, but we’d like your input. Whether you’re a cloud vendor or an end user weighing the benefits and risks of public cloud resources, there are very likely at least a few elements of the ideas presented here that you agree or disagree with. Are public clouds ready for the responsibility, and do you feel the time is right to place trust in the clouds?
