CTO Panel: Are Public Clouds Ripe for Mission Critical Applications?

By Nicole Hemsoth

February 15, 2011

This week we gathered the opinions of five technical leaders at cloud service companies to gauge their views on customer reception of the idea of placing mission-critical applications on public cloud resources. Chief technical officers from smaller public cloud-focused companies, including Stelligent, HyperStratus, Appirio, Arcus Global, and Nube Technologies, weighed in on their sense of customer acceptance of putting core applications in the cloud.

Just as important as the initial question about viability is a secondary query—for those that did decide to send mission-critical apps to the public cloud, what was the driving factor?

A number of surveys have been conducted over the past year to gauge general sentiment about placing business-critical or mission-critical applications in the cloud, and more specifically on a public cloud resource such as that offered by Amazon Web Services.

Although survey data varies according to the respondent base, the consensus seems to be that there is still quite a bit of hesitancy to place mission-critical applications in an environment where there is not a complete sense of control—not to mention concerns about data protection and location, compliance and regulatory risks, fear of lock-in…the list tends to go on.

One recent survey conducted by ESG Research found that of the 600 American and European IT professionals questioned, 42% said that public clouds would not enter into their business models in the next five years. Among the top reasons listed were, perhaps not surprisingly, data and privacy concerns (43%), loss of control (32%), existing investments in current infrastructure (also 32%), the need for the cloud ecosystem to mature before diving in (29%), and satisfaction with existing infrastructure (28%).

While conversations with enterprise IT leaders often follow this same trajectory, the time seemed ripe to check in with technical leaders at a number of cloud services companies to see whether their sense of customer concerns about placing mission-critical applications in the cloud matched the hesitancy reflected in the survey data.

In addition to gauging their sense of the climate for mission-critical applications running on public cloud resources, we also asked a secondary question: is it a “tough sell” for customers to put business-critical applications on such resources, and when it is not, what is the motivating factor?

To provide some depth on the viability of mission-critical applications for public clouds (and what eventually tips the scale for some companies to make that decision), we gathered opinions from Lars Malmqvist, CTO and director of Arcus Global Ltd.; Sonal Goyal, CTO/CEO of Nube Technologies; Paul Duvall, CTO at Stelligent; Glenn Weinstein, CTO at Appirio; and Bernard Golden of HyperStratus.

We’ll start with sentiments from a company that has experience dealing with public sector clients, Arcus Global Ltd.

Lars Malmqvist serves as director and CTO at Arcus Global Ltd., a company that deals specifically with the needs of public sector clients in the UK. The company supports pilots, migration, development and planning for cloud computing projects for large government organizations. This public sector focus made the company a natural choice for the question of whether the concerns outweigh the benefits for core applications on public cloud resources, since governments everywhere are approaching the concept of clouds with caution.

Lars provided a unique perspective as well because, in his experience, it is difficult to keep pace with the demand to put mission-critical applications in the cloud. Lars writes:

“At Arcus we work exclusively with public sector clients. If you’ve been anywhere near government ICT recently, you’ll know that cloud comes up in just about any conversation you have. Different groups respond differently to it: the managers love it for the cost savings, technical people tend to find it interesting and a bit threatening, while the security people really don’t seem to like it much at all.

That being said, on a day-to-day level, far from being a tough sell, the appetite our clients have for putting systems and applications on a public cloud infrastructure far outstrips our ability to actually deliver it in practice. We literally have organisations that would be willing to move their entire core infrastructure to the public cloud tomorrow if we could solve the technical, legal, and security challenges.

The constraints are well known and mainly revolve around security and compliance. Simply put, for some categories of data we simply don’t know what the compliance requirements are for putting it on the public cloud. Best practice and guidance have yet to mature, and laws always lag behind technology.

In the UK, the biggest challenge at the moment is around IL3 (Impact Level 3) data, which, to put it crudely, is data sensitive enough to really mess up someone’s life or to cause significant disruption to public services.

The existing security guidance simply doesn’t map onto a cloud infrastructure in a neat way. Therefore moving something like a system supporting adult social services to the public cloud would require the organization to make an independent risk assessment and be willing to stand by it in the face of external scrutiny. Few organizations in the public sector are quite that risk tolerant.

That being said, government bodies across the world are working to resolve such issues. The pressure is on to cut costs, and everyone in government ICT seems to be looking to the cloud to deliver them.

My expectation would be that within 12 to 18 months these issues will be resolved and clear guidance will be given by central government bodies and their regional equivalents on how to proceed even with highly sensitive data. When that is in place I would expect a mass exodus of in-house systems, business critical or not, from at least local and regional government.”

Sonal Goyal is CTO and CEO at Nube Technologies, a provider of cloud solutions for large-scale analytics and big data problems. Nube’s HIHO, a Hadoop connector for databases and data sources, is a framework that allows customers to move data to and from Hadoop clusters. The company is focused on data mining and analytics using Elastic MapReduce, Hive, Cassandra and related tools, the technologies behind handling both structured and unstructured data at large scale. On the viability of public cloud resources for complex mission-critical applications, Sonal Goyal writes:

“I strongly believe that public cloud usage will grow phenomenally for mission critical business applications and data. The two main concerns organizations have about moving critical pieces to the cloud are security and vendor lock-in.

Companies have been cautious about moving sensitive data to a public cloud for fear of lapses in information security, with data being used by unauthorized channels. Data governance and the ability to audit and monitor data are also genuine concerns. Organizations have worried that cloud providers like Amazon, GoGrid, Rackspace, Google and Microsoft, who share their infrastructure, offer little support in this direction. The second concern is vendor lock-in. Current cloud providers do not offer a unified approach to seamlessly use their services across any provider. Organizations would like to safeguard against this; they would like the flexibility to move critical applications from one cloud to another.

I believe that these concerns, though valid, will slowly be alleviated. These are the same concerns companies had about outsourcing, yet we now outsource payroll, legal and other key business processes, even medical transcription, across countries. Cloud providers are increasingly offering virtual private clouds and reserved infrastructure, so that organizations do not need to share if they don’t want to. Encryption, password-less logins, firewalls and other measures already on offer provide some level of data security.

Recent technical innovations like SecureCloud from TrendMicro and CipherCloud are first steps in making things better. On the interoperability issue, NIST is already working on the SAJACC project and things should get addressed soon. On the API level, there are efforts like DeltaCloud to make things easier.

The needs of the business, the agility required by the market, the ever-exploding data and the need for more and more capacity will drive this change. As businesses grow, they will have to rely more and more on public clouds. Organizations cannot afford to make massive upfront investments in infrastructure and support personnel. The pay-as-you-go model offered by cloud providers will soon be pervasive and will force businesses to re-evaluate their key components and move them to the cloud. They will demand better security standards and uniformity from cloud providers, and they will get them.”
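Goyal’s mention of encryption as one of the data-security layers already on offer can be made concrete with a short sketch. The example below is purely illustrative, assuming the Python cryptography package and a made-up record; it is not tied to any particular provider or to Nube’s products. The idea is that an identifiable field is encrypted at the application layer before the record ever reaches a public cloud data store, so the data owner, rather than the provider, holds the key.

```python
# A minimal sketch of application-level encryption of identifiable data
# before it is written to a public cloud data store. Uses the Python
# "cryptography" package for illustration; not tied to any provider.
from cryptography.fernet import Fernet

# In practice the key would live in a key-management system the customer
# controls, never alongside the data held by the cloud provider.
key = Fernet.generate_key()
cipher = Fernet(key)

record = {"name": "Jane Doe", "ssn": "123-45-6789"}

# Encrypt the sensitive field before it leaves the application tier.
record["ssn"] = cipher.encrypt(record["ssn"].encode()).decode()

# ... the record can now be stored with a public cloud provider ...

# Only code holding the key can recover the original value.
original_ssn = cipher.decrypt(record["ssn"].encode()).decode()
assert original_ssn == "123-45-6789"
```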

Glenn Weinstein is CTO at Appirio, a cloud solutions company that delivers both projects and professional services to customers with mission-critical needs. Glenn writes:

“We are definitely seeing large enterprises moving widespread mission- and business-critical operations to the public cloud. It’s not necessarily a tough sell, particularly to CIOs who have already recognized the value of looking first to public cloud solutions for emerging business problems.

By moving applications to the public cloud, enterprises delegate significant portions of many non-business-specific concerns, including scalability, performance, security, deployment, failover, backup, load balancing and interoperability, to large firms specializing in technology.

This frees up IT resources to focus nearly all their time and energy on using that technology to solve business problems. In this way, public cloud computing finally offers a solution to the long-standing dilemma of IT spending upwards of 70 percent of its budget on routine maintenance and operations. Shifting to the public cloud allows CIOs to flip this ratio and spend 70 percent or more on business analysis and process improvement.

As public cloud leaders like Salesforce.com and Google Apps gain widespread acceptance and experience rapid customer growth, more technology professionals and CIOs are experiencing the benefits to IT first-hand, lending credibility to public cloud claims about speeding up development processes and lowering costs. With a taste of this success, they are eager to push additional projects into the cloud, at the same time that vendors are greatly expanding their platform-as-a-service (PaaS) offerings. We expect this growth to accelerate as CIOs recognize not only the total cost benefits, but also the speed-to-market improvements.”

Paul Duvall is CTO at Stelligent, which provides continuous delivery services, running “Continuous Delivery Operations Centers” for large companies using cloud computing resources. The company has experience handling cloud implementations on Amazon’s infrastructure and has worked with a number of customers to get their applications running in a public cloud environment. Paul writes:

“Our customers (health care, financial, real estate) typically employ a hybrid model, using the public cloud for their numerous non-production environments and a private cloud, or the traditionally hosted approach, for production systems. Given that this hybrid approach was barely a consideration for our customers just a few years ago, I see the trend toward moving systems to a public cloud continuing to gain speed.

Notably, we’ve found that automation is the key to getting the most out of moving to the cloud for mission-critical systems. For example, if you need to install database or application containers by hand every time you stand up a new instance, you’re not getting the kinds of productivity gains you’d achieve by automating the environment instantiation. Automating provisioning and deployment gives the organization enormous flexibility to release its software wherever and whenever it chooses.

Some customers are concerned about data security and whether a public cloud provider increases their vulnerability. If the customer’s concern is simply a lack of trust that the public cloud vendor will keep their data safe, we illustrate the various security processes and mechanisms applied by the public cloud vendor, and suggest applying appropriate application-level security techniques, such as encryption, as they normally would in any system that holds identifiable information.

The well-known maxim “trust, but verify” is quite applicable to the increasing trend of companies moving their mission-critical systems from internally hosted infrastructure to a public cloud. It is striking that some large organizations place implicit trust in their own operations teams to manage their systems, yet if many of these organizations performed an internal security audit, the more respected public cloud vendors would win “hands down” on processes, security accreditation and the like every single time.”
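Duvall’s emphasis on automating environment instantiation can likewise be illustrated with a minimal sketch. The example below assumes an AWS account and the boto3 SDK, and the AMI, key pair and security group identifiers are placeholders rather than values from any real engagement; the point is simply that a new instance installs and starts its database and application container on first boot, with no manual configuration.

```python
# A minimal sketch of automated environment instantiation on AWS EC2.
# Assumptions: the boto3 SDK, valid AWS credentials, and placeholder
# identifiers (AMI, key pair, security group) used purely for illustration.
import boto3

# Bootstrap script passed as user data: every new instance installs and
# starts its database and application container on first boot, so nobody
# has to log in and configure the machine by hand.
USER_DATA = """#!/bin/bash
yum install -y mysql-server tomcat6
service mysqld start
service tomcat6 start
"""

def launch_app_instance(ami_id="ami-12345678"):
    """Provision a fully configured application instance in one API call."""
    ec2 = boto3.client("ec2", region_name="us-east-1")
    response = ec2.run_instances(
        ImageId=ami_id,                    # placeholder AMI
        InstanceType="m1.small",
        MinCount=1,
        MaxCount=1,
        KeyName="deploy-key",              # placeholder key pair
        SecurityGroupIds=["sg-12345678"],  # placeholder security group
        UserData=USER_DATA,
    )
    return response["Instances"][0]["InstanceId"]

if __name__ == "__main__":
    print("Launched instance:", launch_app_instance())
```

Pairing scripted launch with scripted teardown lets teams stand environments up and throw them away on demand, which is where the productivity gains Duvall describes come from.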

Bernard Golden leads HyperStratus, a company that helps organizations take advantage of cloud architectures by advising them on the infrastructure, provider, application and other choices customers must make. Given his experience working with enterprise customers at every stage of the cloud deployment process, he has had time to form some firm opinions about the viability of public clouds for mission-critical applications. Golden states:

“Many organizations have reservations about putting critical business applications in the cloud. The primary concern raised about public cloud computing is security, although it often turns out that the term ‘security’ is used when the concern actually centers on compliance or risk exposure. Our belief and experience is that public cloud computing is viable today for many mission-critical applications.

The primary motivation for application groups to embrace a public cloud alternative is dissatisfaction with the current internal data center offering, whether due to lack of responsiveness or to cost. One client of ours, a Fortune 500 company in the information services industry, considered the corporate data center but decided to pursue a public cloud option because it would reduce their costs on the order of 75%. Nonetheless, convincing large organizations to use a public cloud infrastructure is often difficult, and many are not yet ready to pursue such a choice.

Our expectation is that use of public cloud computing by large organizations will gradually increase as they become more familiar and comfortable with that decision. A galvanizing event for making such a decision is to see a peer organization succeed with a similar application that is being considered for placement in the public cloud.”

This concludes our round of gathered views on the subject, but we’d like your input. Whether you’re a cloud vendor or an end user weighing the benefits and risks of public cloud resources, there are very likely at least a few elements of the ideas presented here that you agree or disagree with. Are public clouds ready for the responsibility, and do you feel that the time is right to place trust in the clouds?
