Security — The Dark Side of the Cloud

By Steve Campbell

January 25, 2010

Cloud computing is a new computing paradigm for many, but for the rest of us it is simply today’s version of timesharing – Timesharing 2.0. On-demand, pay-for-usage computing has been the norm for many HPC organizations for decades. These users either never had the budget for their own computing resources, or their projects needed only limited access to powerful compute resources.

In that sense, HPC users, like the biggest commercial users, already trust cloud computing with their proprietary applications, data and results. They have been using pay-for-usage services for years, and many have evolved from the early days of timesharing. In many cases, supercomputing centers and government research labs provide the compute resources. If you are a commercial HPC user in oil & gas, financial services, manufacturing or another industry, the compute resources will probably be found in the corporate datacenter.

This HPC user community has pioneered the tools necessary to allocate, measure and control access for specific users and projects while protecting users from unauthorized access or modification of applications and data, malicious erasure, and premature disclosure of results. This community also developed sophisticated accounting and chargeback software that kept track of everything from CPU cycles to memory usage to access time and storage used. Suffice it to say that HPC users are well ahead of their counterparts in the commercial datacenter, and the latter would do well to look toward the former for some guidance in this area.
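To make the idea concrete, the sketch below shows the kind of per-project chargeback accounting described above. It is a minimal illustration in Python; the resource categories and rates are assumptions for the example, not any particular center’s billing model.

```python
from collections import defaultdict

# Illustrative rates; real centers set these per resource and per project tier.
RATES = {
    "cpu_hours":    0.10,  # $ per CPU-hour
    "mem_gb_hours": 0.01,  # $ per GB-hour of memory
    "storage_gb":   0.05,  # $ per GB-month of storage
}

def chargeback(usage_records):
    """Aggregate raw usage records into a per-project invoice total."""
    invoices = defaultdict(float)
    for rec in usage_records:
        for resource, amount in rec["usage"].items():
            invoices[rec["project"]] += amount * RATES[resource]
    return dict(invoices)

records = [
    {"project": "drug-screening", "usage": {"cpu_hours": 5000, "mem_gb_hours": 20000}},
    {"project": "airframe-cfd",   "usage": {"cpu_hours": 12000, "storage_gb": 800}},
]
print(chargeback(records))  # {'drug-screening': 700.0, 'airframe-cfd': 1240.0}
```

Real accounting systems layer per-user allocation limits and audit trails on top of this kind of aggregation, which is exactly the discipline the HPC community pioneered.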

Without a doubt, the biggest challenge to cloud computing is security – the dark side of the cloud. In the cloud paradigm, the user community does not, or should not, care about the physical side of business operations. In most cases, the physical infrastructure is housed, managed and owned by a third party, and you pay for the resources you use, just as with the electric and gas utilities. Despite all these capabilities and features, security remains as much of a concern for the HPC community as it does for consumers worried about protecting their identities and credit card information.

Imagine for a moment the business ramifications if critical drug data or aircraft design results were altered by malicious activity, or released to the world prematurely. The real and intangible costs to your company could be devastating.

Threats to network and information security have been around for decades; there is nothing new in that. However, the complexity and scale of attacks are rising at an alarming rate, presenting organizations with a huge challenge as they struggle to defend against this ever-present threat. Today, cybercrime is more lucrative and less risky than other forms of criminal activity, and attacks are striking more and more businesses. Estimates of the cost of disruption, data theft and other nefarious activities were pegged at a staggering $1 trillion for 2008. Certainly more than a round-off error!

Just this month, the news headlines in CNET News include “Google China insiders may have helped with attack” and in the Wall Street Journal: “Fallout From Cyber Attack Spreads.” CCTV.com reported, “China’s largest search engine paralyzed in cyber-attack….” And a ZDNet headline on Jan. 21 read: “Microsoft knew of IE zero-day flaw since last September.”

In July 2009, an associatedcontent.com headline read “Near-Simultaneous Cyber Attacks Down U.S. Government Websites.” The article reported that the attack targeted the “White House, Pentagon, NYSE, Secret Service, NSA, Homeland Security, State, Nasdaq, Treasury, FAA, FTC, and DOT Websites.”

The low risk and low cost of entry make cyber-crime an attractive and lucrative “business.” Cloud-based computing exacerbates the situation by concentrating, and easing access to, ever-larger amounts of information. IT organizations have a hard enough time defending their in-house private cloud resources. Companies offering public cloud, pay-for-usage models face a more difficult challenge still, since they must serve multiple organizations on the same platform. At the same time, there is an opportunity for innovation in flexible cloud-based security service offerings.

The criminal element employs powerful tools such as botnets, which enable attackers to infiltrate large numbers of machines. The “2009 Emerging Cyber Threats Report” from the Georgia Tech Information Security Center (GTISC) estimates that botnet-affected machines may comprise 15 percent of online computers. Another report, compiled by Panda Labs, estimates that in the second quarter of 2008, 10 million botnet computers were used to distribute spam and malware across the Internet each day. With the growth of the cloud paradigm, more and more mission-critical information will flow over the Web to publicly hosted cloud services. The conventional wisdom of defending the perimeter is insufficient for this dynamic, distributed environment. Whatever the application, the common thread is that users must consider security before signing up for public cloud services.

During SC09, I met with many of the HPC infrastructure vendors and also spoke with some real-world HPC cloud users about the concerns they have using cloud computing for their workloads. (This was not a structured industry survey.) Some did express concerns about security, but mainly in the context of using public cloud resources versus their private cloud resources. However, they also expressed concerns about transitioning their HPC workloads from in-house resources to external public cloud resources, as it is a very different scenario from moving commercial workloads. From a security standpoint, the concerns ranged from unauthorized access to exposure of critical information to malicious activity. Additional concerns include the movement and encryption of data to public clouds and the persistence of that data once workloads have completed. Has the data really been deleted? It is all about data integrity.
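One common way to address the data movement and encryption concerns raised here is to encrypt data on the client side, before it ever leaves in-house resources, so the public cloud only stores and returns ciphertext. Below is a minimal sketch assuming the third-party Python cryptography package; the payload is hypothetical.

```python
from cryptography.fernet import Fernet  # third-party: pip install cryptography

# The key is generated and kept in-house (e.g., in an on-premises key store);
# only ciphertext ever travels to, or rests in, the public cloud.
key = Fernet.generate_key()
cipher = Fernet(key)

plaintext = b"proprietary simulation results"  # hypothetical payload
ciphertext = cipher.encrypt(plaintext)         # this blob is what gets uploaded

# After downloading the blob back in-house:
assert cipher.decrypt(ciphertext) == plaintext
```

Client-side encryption does not answer the persistence question by itself, but it does mean that any copies the provider fails to delete remain unreadable without the in-house key.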

HPC users often have many options available for running their workloads. For example, an academic user may have access to in-house central computing resources shared between multiple departments, or even to large-scale supercomputing centers. In this environment the user data, results and applications are still very much ‘in-house,’ and even though there is some security risk, users are better protected. HPC users in private industry, especially those in large-scale multinational companies, may have the option of private clouds for their workloads and, like academic HPC users, have fewer security concerns. However, if the HPC user is looking at commercial third-party providers of public clouds, whether it is Amazon’s EC2, Google’s App Engine or, better still, HPC-specific cloud vendors, they should spend the time to ensure that these vendors fully address their requirements for security, encryption and data persistence.

To those organizations that do not have internal private clouds and want to use cloud computing from a third-party vendor, I recommend you consider the following five security evaluation criteria (a simple scorecard sketch follows the list):

  1. Evaluate the vendor’s security features very carefully. Ensure that they provide more than just password-protected access.
     
  2. Look into the collaboration tools and resource sharing to prevent data leakage. Security is all about the data.
     
  3. Look into authentication and the basic infrastructure security. What happens in the event of a disaster? What is their disaster recovery plan, what are their backup procedures, and how often do they test this process? Has the provider ever had a failure or security breach and, if so, what happened?
     
  4. Can they build a private cloud for your workload? What is their data persistence policy? Can they guarantee data transfer security from in-house resources to the public cloud?
     
  5. Ask to review their best practices policy and procedures and check to see if it includes security audits and regular testing.
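As noted above, here is a hypothetical scorecard that turns these five criteria into a simple, repeatable vendor comparison. The weights and ratings are illustrative assumptions, not an industry standard; the point is to force a structured evaluation rather than a gut call.

```python
# Hypothetical weights, one per criterion above (they sum to 1.0).
CRITERIA = {
    "access_controls_beyond_passwords": 0.25,  # criterion 1
    "data_leakage_prevention":          0.20,  # criterion 2
    "auth_and_disaster_recovery":       0.20,  # criterion 3
    "private_cloud_and_persistence":    0.20,  # criterion 4
    "audited_best_practices":           0.15,  # criterion 5
}

def score_vendor(ratings):
    """Weighted score from per-criterion ratings on a 0-5 scale."""
    return sum(CRITERIA[c] * ratings[c] for c in CRITERIA)

vendor_a = {
    "access_controls_beyond_passwords": 4,
    "data_leakage_prevention":          3,
    "auth_and_disaster_recovery":       5,
    "private_cloud_and_persistence":    2,
    "audited_best_practices":           4,
}
print(f"Vendor A: {score_vendor(vendor_a):.2f} / 5")  # Vendor A: 3.60 / 5
```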

Cloud computing is not so much a new technology as a new delivery model, but its impact will be enormous. Research firm IDC estimated worldwide cloud services revenue at $17.4 billion for 2009 and forecast it to grow to $44.2 billion by 2013. The economies of scale and centralization of resources create new security challenges for an already stressed IT infrastructure.

This concentration of resources and data will be a tempting target for cyber criminals. Consequently, cloud-based security must be more robust. Spend the time to evaluate the security and make sure it is designed in and not added on after a breach. Partner with a trusted vendor. And if in doubt, seek advice.

About the Author

Steve Campbell, an HPC Industry Consultant and HPC/Cloud Evangelist, has held senior VP positions in product management and product marketing for HPC and enterprise vendors. Campbell has served as vice president of marketing for Hitachi, Sun Microsystems and FPS Computing, and has held lead marketing roles at Convex Computer Corporation and Scientific Computer Systems. He has also served on the boards of, and as interim CEO/CMO of, several early-stage technology companies.
