CTO Panel: Are Public Clouds Ripe for Mission Critical Applications?

By Nicole Hemsoth

February 15, 2011

This week we gathered the opinions of five technical leaders at cloud service companies to gauge their views on customer reception of the idea of placing mission-critical applications on public cloud resources. Chief Technology Officers from smaller public cloud-focused companies, including Stelligent, HyperStratus, Appirio, Arcus Global, and Nube Technologies, weighed in on their sense of customer acceptance of putting core applications in the cloud.

Just as important as the initial question about viability is a secondary query—for those that did decide to send mission-critical apps to the public cloud, what was the driving factor?

A number of surveys have been conducted over the course of the past year to gauge general sentiment about placing business-critical or mission-critical applications in the cloud; more specifically, on a public cloud resource such as that offered by Amazon Web Services.

Although survey data varies according to the respondent base, the consensus seems to be that there is still quite a bit of hesitancy to place mission-critical applications in an environment where there is not a complete sense of control—not to mention concerns about data protection and location, compliance and regulatory risks, fear of lock-in…the list tends to go on.

One recent survey conducted by ESG Research found that of the 600 American and European IT professionals questioned, 42% said that public clouds would not enter into their business models in the next five years. Among the top reasons listed were, perhaps not surprisingly, data and privacy concerns (43%), loss of control (32%), existing investments in current infrastructure (also 32%), and the need to feel that the cloud ecosystem is mature before diving in (29%), while 28% responded that they were simply satisfied with their current infrastructure.

While conversations with enterprise IT leaders often follow this same trajectory, the time seemed ripe to check in with technical leaders at a number of cloud services companies to see whether their sense of customer concerns about placing mission-critical applications in the cloud matched the hesitancy reflected in the survey data.

In addition to gauging their sense of the climate for mission-critical applications running on public cloud resources, we also asked a secondary question: is it a ‘tough sell’ for customers to put business-critical applications on such resources, and when it is not, what is the motivating factor?

To provide some depth on the viability of mission-critical applications for public clouds (and what eventually tips the scale for some companies to make that decision), we gathered opinions from Lars Malmqvist, CTO and Director of Arcus Global Ltd.; Sonal Goyal, CTO/CEO of Nube Technologies; Paul Duvall, CTO at Stelligent; Glenn Weinstein, CTO at Appirio; and Bernard Golden of HyperStratus.

We’ll start with sentiments from a company that has experience dealing with public sector clients, Arcus Global Ltd.

Lars Malmqvist serves as Director and CTO at Arcus Global Ltd., a company that deals specifically with the needs of public sector clients in the UK. The company supports pilots, migration, development and planning for cloud computing projects at large government organizations. This public sector focus made the company a natural choice for the question of whether the concerns outweigh the benefits for core applications on public cloud resources, since governments everywhere are approaching the cloud with caution.

Lars provided a unique perspective as well: in his experience, it is difficult to keep pace with the demand to put mission-critical applications in the cloud. He writes:

“At Arcus we work exclusively with public sector clients. If you’ve been anywhere near government ICT recently, you’ll know that cloud comes up in just about any conversation you have. Different groups respond differently to it: the managers love it for the cost savings, technical people tend to find it interesting and a bit threatening, while the security people really don’t seem to like it much at all.

That being said, on a day-to-day level, far from being a tough sell, the appetite our clients have for putting systems and applications on a public cloud infrastructure far outstrips our ability to actually deliver it in practice. We literally have organisations that would be willing to move their entire core infrastructure to the public cloud tomorrow if we could solve the technical, legal, and security challenges.

The constraints are well known and mainly revolve around security and compliance. Simply put, for some categories of data we simply don’t know what the compliance requirements are for putting it on the public cloud. Best practice and guidance have yet to mature, and laws always lag behind technology.

In the UK, the biggest challenge at the moment is around IL3 (Impact Level 3) data, which to put it crudely is data that is sensitive enough to really mess someone’s life up or to cause significant disruption to public services.

The existing security guidance simply doesn’t map onto a cloud infrastructure in a neat way. Therefore moving something like a system supporting adult social services to the public cloud would require the organization to make an independent risk assessment and be willing to stand by it in the face of external scrutiny. Few organizations in the public sector are quite that risk tolerant.

That being said, government bodies across the world are working on resolving such issues. The pressure is on to cut costs, and everyone in government ICT seems to be looking at the cloud to deliver them.

My expectation would be that within 12 to 18 months these issues will be resolved and clear guidance will be given by central government bodies and their regional equivalents on how to proceed even with highly sensitive data. When that is in place, I would expect a mass exodus of in-house systems, business critical or not, from at least local and regional government.”

Sonal Goyal is CTO and CEO at Nube Technologies, a provider of cloud solutions for large-scale analytics and big data problems. Nube’s HIHO, a Hadoop connector for databases and data sources, is an innovative framework that allows customers to move data to and from Hadoop clusters. The company is focused on data mining and analytics using Elastic MapReduce, Hive, Cassandra and related tools, the solutions behind handling both structured and unstructured data at large scale. On the viability of public cloud resources for complex mission-critical applications, Sonal Goyal writes:

“I strongly believe that public cloud usage will grow phenomenally for mission critical business applications and data. The two main concerns organizations have about moving critical pieces to the cloud are security and vendor lock-in.

Companies have been cautious about moving sensitive data to a public cloud for fear of lapses in information security, of data being used by unauthorized channels. Data governance and the ability to audit and monitor data are also genuine concerns. Organizations have been worried that cloud providers like Amazon, GoGrid, Rackspace, Google, Microsoft and others who share their infrastructure offer little support in this direction. The second concern is vendor lock-in. Current cloud providers do not offer a unified approach to seamlessly use their services across any provider. Organizations would like to safeguard against this; they would like the flexibility of being able to move critical applications from one cloud to another.

I believe that these concerns, though valid, will slowly be alleviated. These are the same concerns companies had over outsourcing, yet we now outsource payroll, legal, key business processes, even medical transcription across countries. Cloud providers increasingly offer virtual private clouds and reserved infrastructure, so that organizations do not need to share if they don’t want to. Encryption, passwordless logins, firewalls and the like already offer some level of data security.

Recent technical innovations like SecureCloud from Trend Micro and CipherCloud are first steps in making things better. On the interoperability issue, NIST is already working on the SAJACC project, and things should get addressed soon. At the API level, there are efforts like Deltacloud to make things easier.

The needs of the business, the agility required by the market, the ever-exploding data and the need for more and more capacity will drive this change. As businesses grow, they will have to rely more and more on public clouds. Organizations cannot afford to make massive upfront investments in infrastructure and support personnel. The pay-as-you-go model offered by cloud providers will soon be pervasive and will force businesses to re-evaluate their key components and move them to the cloud. They will demand better security standards and uniformity from their cloud providers, and they will get it.”
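
Goyal’s point about encryption as a client-side safeguard can be made concrete with a short sketch. What follows is a minimal illustration, not a description of any vendor’s product: it assumes the Python boto3 and cryptography libraries and uses a hypothetical bucket name and record, showing sensitive data being encrypted before it ever leaves the organization so the provider stores only ciphertext.

```python
# Minimal sketch (hypothetical names): encrypt a record client-side
# before uploading it to public cloud storage, so the provider only
# ever holds ciphertext. Assumes boto3 and cryptography are installed
# and AWS credentials are configured.
import boto3
from cryptography.fernet import Fernet

# In practice the key lives in an on-premises key store, never in the
# cloud next to the data it protects.
key = Fernet.generate_key()
cipher = Fernet(key)

record = b"patient-id=1234; diagnosis=..."  # hypothetical sensitive payload
ciphertext = cipher.encrypt(record)

s3 = boto3.client("s3")
s3.put_object(Bucket="example-sensitive-data",  # hypothetical bucket
              Key="records/1234.enc",
              Body=ciphertext)

# Retrieval and decryption happen only on trusted infrastructure.
obj = s3.get_object(Bucket="example-sensitive-data", Key="records/1234.enc")
assert cipher.decrypt(obj["Body"].read()) == record
```

The design choice is simply that the provider never sees the key, so the worry about unauthorized channels reduces to protecting one on-premises secret.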

Glenn Weinstein is CTO at Appirio, a cloud solutions company that delivers both projects and professional services to customers with mission-critical needs. Glenn writes:

“We are definitely seeing large enterprises moving widespread mission- and business-critical operations to the public cloud. It’s not necessarily a tough sell, particularly to CIOs who have already recognized the value of looking first to public cloud solutions for emerging business problems.

By moving applications to the public cloud, enterprises delegate significant portions of many non-business-specific concerns, including scalability, performance, security, deployment, failover, backup, load balancing and interoperability, to large firms specializing in technology.

This frees up IT resources to focus nearly all their time and energy on using that technology to solve business problems. In this way, public cloud computing finally offers a solution to the long-standing dilemma of IT spending upwards of 70 percent of its budget on routine maintenance and operations. Shifting to the public cloud allows CIOs to flip this ratio and spend 70 percent or more on business analysis and process improvement.

As public cloud leaders like Salesforce.com and Google Apps gain widespread acceptance and experience rapid customer growth, more technology professionals and CIOs are experiencing the benefits to IT first-hand, lending credibility to public cloud claims about speeding up development processes and lowering costs. With a taste of this success, they are anxious to push additional projects into the cloud, at the same time that the vendors are greatly expanding their platform-as-a-service (PaaS) offerings.  We expect this growth to accelerate as CIOs recognize not only the total cost benefits, but also the speed-to-market improvements.”

Paul Duvall is CTO at Stelligent, which provides “Continuous Delivery Services-Continuous Delivery Operations Centers for large companies using cloud computing resources.” They have experience handling cloud implementations on Amazon’s servers and have worked with a number of customers to get their applications running in a public cloud environment. Paul writes:

“Our customers (health care, financial, real estate) typically employ a hybrid model, using the public cloud for their numerous non-production environments and a private cloud, or the traditionally hosted approach, for production systems. Given that this hybrid approach was barely a consideration for our customers just a few years ago, I see the trend toward moving systems to a public cloud continuing to gain speed.

Notably, we’ve found that automation is the key to getting the most out of moving to the cloud for mission-critical systems. For example, if you need to install database or application containers every time you stand up a new instance, you’re not getting the kinds of productivity gains that you’d achieve by automating the environment instantiation. Automating the provisioning and deployment provides the organization enormous flexibility to release their software wherever and whenever they choose.

Some customers are concerned about data security and whether a public cloud provider increases their vulnerability. If the customer’s concern is simply a lack of trust that the public cloud vendor will keep their data safe, we illustrate the various security processes and mechanisms applied by the public cloud vendor, and we suggest applying appropriate application-security techniques such as encryption, as they would in any system that holds identifiable information.

The famous quote “trust, but verify” is quite applicable to the increasing trend of companies moving their mission-critical systems from internally hosted infrastructure to a public cloud. It’s shocking that some large organizations place implicit trust in the in-house operations teams who manage their systems. Yet if many of these organizations performed an internal security audit, the more respected public cloud vendors would win hands down in terms of processes, security accreditation and the like, every single time.”
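
Duvall’s automation argument is easy to sketch in code. The snippet below is a minimal illustration rather than Stelligent’s actual tooling: it assumes the Python boto3 library, configured AWS credentials, and a hypothetical AMI ID, and shows environment instantiation automated with a boot-time user-data script so that the database and application container install themselves instead of being set up by hand.

```python
# Minimal sketch (hypothetical AMI and packages): launch an EC2
# instance whose user-data script installs the database and the
# application container on first boot, so standing up a new
# environment requires no manual installation steps.
import boto3

USER_DATA = """#!/bin/bash
yum install -y mysql-server tomcat
service mysqld start
service tomcat start
"""

ec2 = boto3.client("ec2")
response = ec2.run_instances(
    ImageId="ami-12345678",   # hypothetical base image
    InstanceType="t3.micro",
    MinCount=1,
    MaxCount=1,
    UserData=USER_DATA,
)
print("Launched:", response["Instances"][0]["InstanceId"])
```

Because the whole environment is captured in a script, the same instantiation can be repeated for development, test, or production targets, which is the flexibility to release software wherever and whenever the organization chooses that Duvall describes.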

Bernard Golden leads HyperStratus, a company that helps organizations take advantage of cloud architectures by advising on the infrastructure, provider, application and other choices customers must make. Given his experience working with enterprise customers at every stage of the cloud deployment process, he has had time to form some firm opinions about the viability of public clouds for mission-critical applications. Golden states:

“Many organizations have reservations about putting critical business applications in the cloud. The primary concern raised about public cloud computing is security, although it often turns out that while the term security is used, the concern actually centers on compliance or risk exposure. Our belief and experience is that public cloud computing is viable today for many mission-critical applications.

The primary motivation for application groups to embrace a public cloud alternative is dissatisfaction with the current internal data center offering, whether for lack of responsiveness or for cost. One client of ours, a Fortune 500 company in the information services industry, considered the corporate data center but decided to pursue a public cloud option because it would reduce their costs on the order of 75%. Nonetheless, convincing large organizations to use a public cloud infrastructure is often difficult, and many are not yet ready to pursue such a choice.

Our expectation is that use of public cloud computing by large organizations will gradually increase as they become more familiar and comfortable with that decision. A galvanizing event is often seeing a peer organization succeed with an application similar to the one being considered for placement in the public cloud.”

This concludes our round of gathered views on the subject, but we’d like your input. Whether you’re a cloud vendor or an end user weighing the benefits against the risks of public cloud resources, there are very likely at least a few elements of the ideas presented here that you agree or disagree with. Are public clouds ready for the responsibility, and do you feel the time is right to place trust in the clouds?
