Air Force, University of Illinois Take Aim at Cloud Challenges

By Nicole Hemsoth

May 9, 2011

When natural disasters strike, diverse nations often heed the call of the country in peril, sending both supplies and tactical support. Although pipelines exist to streamline these rescue efforts, roadblocks can occur when the country in crisis has unstable relationships with the source of aid.

According to researchers supporting a new effort to improve military networks across borders using cloud computing resources, “staging such an operation would be risky without a cloud infrastructure that has secure properties.” As they note, a mission conducted in a possibly hostile environment cannot reliably benefit from the communications, computations and applications of cloud computing without networks that operate seamlessly and securely, within a framework of trusted practices and standards.

To address complications such as these that routinely emerge in military contexts, the University of Illinois unveiled a new research initiative today aimed at creating a more secure, robust environment for military applications as they traverse government and third-party networks.

Dubbed the Assured Cloud Computing Center, the new program will be backed by $6 million from the U.S. Air Force Research Laboratory Technology Directorate (AFRL), which will work in tandem with the university and the Air Force Office of Scientific Research (AFOSR).

The center will be located within the University of Illinois Information Trust Institute where a team of dedicated researchers will set to work tackling some of the cloud’s most pressing issues, especially in the context of military applications living in the cloud.

The team’s most significant efforts will be concentrated on the matter of “blue” and “gray” networks and the associated problems of security, confidentiality, data integrity and communications—not to mention the general functionality of the applications that require such data protection-related scrutiny.

Dr. Roy Campbell, the Sohaib and Sara Abbasi Professor in the Department of Computer Science at Illinois provided details about the numerous distinctions between “blue and gray” networks. He stated in a release today that “A computational cloud used in military applications may include both blue and gray networks, where ‘blue’ networks are U.S. military networks, which are considered secure—and gray networks, which are those in private hands or perhaps belong to other nations that are considered unsecure.”

Campbell noted that these distinctions and the concerns they bear are critical considerations for the future of military cloud computing because for some military goals, there will be benefits to coordinating computation across a blend of these two resource types.

To follow up on the announcement of the Assured Cloud Computing Center, we asked Dr. Campbell a few additional questions about the scope of cloud security problems, especially as they relate to military applications, and touched on some tangential matters, including how this research will extend to the clouds of the future.

HPCc: Give us a personalized account of the current state of cloud computing security: is it overhyped as a problem? After all, there are also potential breach possibilities with in-house systems for the U.S. military. In other words, what specific security problems are involved with military cloud computing?

Campbell: The current state of cloud computing security is clearly lacking, as has been demonstrated recently (say, by Sony). Whether the problem is overhyped depends on the risks and costs of compromise.

The model of a cloud computing environment is evolving quickly. The Air Force must be able to conduct network-centric warfare as well as missions of national importance. Clearly, in many circumstances, assurances in the forms of security and dependability are crucial to the successful outcome of the mission. Now, however, throw into the mix the need for the Air Force to perform international operations using both military and non-military IT resources, and you have additional complexity. To this end, assured cloud computing has to be end-to-end and cross-layered. It has to operate over multiple security domains. When the lives of Air Force personnel and our national interest may depend on the correct functioning of the cloud, the need for assured cloud computing becomes a priority.

HPCc: Why is the Air Force so keen on the clouds? What is the advantage for them to have remote access to applications?

Campbell: The Air Force depends very heavily on surveillance, remote sensing, drones, complex computer controlled weapon systems, and powerful computers capable of complex analysis.  Missions can be viewed as complex flows of information from sensors, through command and control, to actuation.

Speed and availability are of the essence. In conducting international missions, the Air Force may not have a complex network at its disposal. In many emergency situations and natural disasters, infrastructure can be damaged, and communications and operations may need other IT support.

Assured cloud computing gives the Air Force the advantage of being able to get the right resources for a mission from a range of available sources. It clearly helps to provide the Air Force an edge that will allow them to succeed in their missions.

HPCc: Do you see increasing collaboration across the blue and gray computational networks? In other words, it is often assumed that military applications are housed exclusively on military networks. Is this a hybrid cloud model you see emerging in the future, with mission-critical apps on in-house “blue” machines while other, less mission-critical or security-sensitive applications are sent to a third-party provider? Give us a sense of this landscape.

Campbell: When the military is conducting a mission, a successful outcome is paramount. The question becomes what it takes to conduct the mission and what is available to allow that to happen. We have already observed natural disasters that have taken out critical infrastructures vital to rescue missions (for example, in Japan). When that happens, the military needs the ability to use whatever resources remain available.

HPCc: There are a lot of references to “blue” and “gray” networks and to cloud security as a loose concept, but let’s get more specific: where do you start tackling some of the cloud’s security issues? So many layers are involved; better yet, what is the first and most important item of business for your research team on this security front?

Campbell: There are lots of security solutions to problems, but knowing how they apply to a particular system and being able to use them for a specific mission is difficult. I expect we will find problems for which we cannot yet provide a solution, and our researchers will have to investigate. Firewalls, intrusion detection systems, encryption technology and access controls are all resources we can use. But the problem is getting a mission completed, and determining what it takes to do so in an assured manner.

One technology we will definitely be deploying is the modeling and simulation of systems to better understand their vulnerabilities and problems. We will also be looking at more appropriate access controls that can be deployed across mixtures of blue and gray networks, and at how we can monitor systems for better security analysis. I expect quite a lot of our first year in this grant will be spent collaborating with our Air Force researchers, understanding the complete spectrum of problems faced by the Air Force and documenting them in terms of what technologies can be used to solve them.

HPCc: What lessons from this initiative can be passed along to the public eventually? Are there some core security or other developments you’re working on that will find their way into public cloud providers’ arsenals? Explain, in other words, the “trickle down” effect that you think might happen.

Campbell: We have developed clouds as a means of providing humanity an inexpensive and pervasive means of computation and communication. What we haven’t done yet, and what our center hopes to address, is how to provide that computation and communication in a manner that is trustworthy and available: that is, assured for the various missions humanity might need in the future. This Air Force initiative is an important first step.
