Is D.C. Ready for Cloud Computing?

By Derrick Harris

October 22, 2008

We already know the Department of Defense is sold on cloud computing as long as it remains internal, but what about the federal government overall? The word is that it definitely has been discussed, and at least one systems integrator thinks it can get the government onto public clouds.

The Plan

The systems integrator in question, Apptis, is working on a model that would allow government agencies to take advantage of public cloud computing offerings. According to Cameron Chaboudy, director of advanced and emerging technologies at Apptis, the offering will ease datacenter pressures by allowing agencies to operate in a hybrid computing model, offloading less-security-sensitive applications to the cloud.

Apptis CTO Phil Horvitz seizes on the cloud’s ability to handle “surges” in need, also referred to as cloudbursting. “That’s where you see the tremendous benefits of cloud computing,” he says, citing the Department of Homeland Security as a possible user in the case of a disaster. “We’re proposing a hybrid approach, where they keep their existing infrastructure and, upon surge or upon load, they go and leverage a cloud,” he explains. “It acts more like a supercharger to the application.”
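To make the pattern concrete, here is a minimal sketch of the cloudbursting approach Horvitz describes: baseline load stays on the agency’s existing infrastructure, and only the surge spills over to a public cloud. The scheduler, capacity figures and routing logic below are hypothetical illustrations, not Apptis’ actual design.

```python
# Minimal sketch of "cloudbursting": keep a fixed on-premises capacity for
# baseline load and spill overflow work to a public cloud only during surges.
# All names and numbers are hypothetical illustrations.

from dataclasses import dataclass


@dataclass
class HybridScheduler:
    on_prem_capacity: int          # requests/sec the agency datacenter can absorb
    current_on_prem_load: int = 0  # requests/sec currently handled in-house

    def route(self, incoming_load: int) -> dict:
        """Split an incoming burst between the datacenter and the cloud."""
        spare = max(self.on_prem_capacity - self.current_on_prem_load, 0)
        to_on_prem = min(incoming_load, spare)
        to_cloud = incoming_load - to_on_prem  # only the surge leaves the building
        self.current_on_prem_load += to_on_prem
        return {"on_prem": to_on_prem, "cloud_burst": to_cloud}


if __name__ == "__main__":
    scheduler = HybridScheduler(on_prem_capacity=1000, current_on_prem_load=900)
    # A disaster-driven spike of 500 req/s: 100 stay in-house, 400 burst to the cloud.
    print(scheduler.route(500))
```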

Because the end result of Apptis’ efforts will be offered on providers’ own infrastructures, government users will be billed just like regular customers of whatever service they choose to use (Apptis has been in discussions with ServerVault, Amazon and Google, at least), but there will be value-adds, as well. For starters, special security considerations will be put in place beforehand, and Apptis hopes to be accountable for making sure SLAs are met, says Chaboudy (providers like Amazon and Google, with its first-generation App Engine offering, have conspicuously absent SLAs). In addition, he says, “We’ll watch the government systems and be able to make adjustments if we see degradation and work with the cloud providers that way to take that out of the government’s need so they can leave their resources freed up to do the [day-to-day operations].”
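The monitoring role Chaboudy describes can be pictured as a simple watchdog that tracks a government-facing service and escalates to the cloud provider when performance degrades. The sketch below is an illustration only; the endpoint, response-time target and escalation path are placeholders, not Apptis’ tooling.

```python
# Rough sketch of third-party monitoring of a cloud-hosted government service:
# flag degradation so the intermediary, not the agency, chases the provider.
# Endpoint, threshold and alerting are hypothetical placeholders.

import time
import urllib.request

SERVICE_URL = "https://example.gov/health"  # placeholder endpoint
LATENCY_SLO_SECONDS = 2.0                   # hypothetical response-time target


def check_once(url: str) -> float:
    """Return the response time in seconds, or raise on failure."""
    start = time.monotonic()
    with urllib.request.urlopen(url, timeout=10):
        pass
    return time.monotonic() - start


def watch(url: str, interval_seconds: int = 60) -> None:
    while True:
        try:
            latency = check_once(url)
            if latency > LATENCY_SLO_SECONDS:
                print(f"DEGRADED: {url} responded in {latency:.2f}s; escalate to provider")
        except Exception as exc:
            print(f"DOWN: {url} unreachable ({exc}); escalate to provider")
        time.sleep(interval_seconds)


if __name__ == "__main__":
    watch(SERVICE_URL)
```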

Aside from the extras Apptis will provide, Horvitz sees this government cloud initiative as fertile ground for other consultants, too. Most cloud providers don’t offer managed services, he notes, so companies will be able to jump in the middle and provide higher availability, cloud-enablement of existing applications and other services that might be outside the knowledge base of agency employees.

IT Budgets Stressed in Washington, Too

Like most IT organizations, government agencies are feeling pressure to change their IT acquisition models because of budget constraints, says Chaboudy. Demands keep increasing, but government IT departments have neither the money nor the willingness to just keep building new datacenters. CIOs see that things are out of control, with datacenters that are “exploding” and operational expenditures growing rapidly, and Horvitz says cloud is a great solution for addressing these concerns. Why not, he asks, buy computing for $200 per hour and have someone manage it for you, rather than pay $30 million to build new infrastructure?
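Horvitz’s comparison is easy to sanity-check with back-of-the-envelope arithmetic. The snippet below uses only the two figures quoted above and ignores staffing, power, refresh cycles and everything else a real business case would include.

```python
# Back-of-the-envelope version of Horvitz's comparison: renting capacity at the
# quoted $200/hour versus a $30 million datacenter build. Illustrative only.

HOURLY_CLOUD_RATE = 200              # dollars per hour, figure quoted in the article
DATACENTER_BUILD_COST = 30_000_000   # dollars, figure quoted in the article

breakeven_hours = DATACENTER_BUILD_COST / HOURLY_CLOUD_RATE
breakeven_years = breakeven_hours / (24 * 365)

print(f"Break-even: {breakeven_hours:,.0f} hours (~{breakeven_years:.1f} years of 24/7 use)")
# -> Break-even: 150,000 hours (~17.1 years of 24/7 use)
```

At round-the-clock utilization it would take roughly 17 years of rented capacity to match a $30 million build, which is why the pitch resonates with CIOs whose surges are occasional rather than constant.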

So far, he says, feedback has been “tremendous,” especially from CIOs (who really like now-legendary price-performance stories like what the New York Times did with Amazon EC2). Acknowledging that the people who manage the datacenters can find a million reasons to shoot this concept down, Horvitz says CIOs are listening because of their aforementioned budget pressures. Chaboudy says that Apptis usually pitches at the IT level because solutions are targeted toward solving a specific problem, but the cost advantages of cloud computing have made it more of a CIO-level pitch.

But it’s not exclusively about saving money; overall efficiency also factors into government excitement over cloud computing. Chaboudy says the paradigm helps them maximize use of current infrastructure (presumably because excess capacity can be leveraged thanks to the knowledge that the cloud is there for any additional needs), and the fact that most cloud offerings today have limited development options might encourage agencies to utilize standard operating systems and languages.

Notions of workplace respect, too, are among the reasons for adopting cloud computing. Horvitz says that the government eventually will develop standards for the delivery model, and while Apptis is taking the initiative to help define what those standards will look like, some CIOs see an opportunity to step up, show they are innovative and do the right thing early on in the process.

Umm … Security?

Let’s not be naïve, though. You can’t talk about what the government wants without mentioning the elephant in the room: security. Early on, Apptis spotted security and trust as major challenges in getting federal agencies on board with cloud computing. As Chaboudy put it, “Nobody wants to be on the front page of the Washington Post [because of a security lapse].” Apptis’ government cloud offering, he adds, really is based around educating the various stakeholders within the government and helping them get over these security challenges.

Chaboudy says all government workloads are subject to some degree of regulation, but Apptis thinks it can adapt in the near term to get the government leveraging external clouds, at least minimally, sooner rather than later. In the longer term, Apptis wants to help define the currently nonexistent government policies and procedures for cloud computing. Horvitz cites FISMA (the Federal Information Security Management Act) as a regulation that still tells agencies what they can and cannot do, but was written before cloud computing was even an option. He says organizations like the Information Technology Association of America (ITAA) are looking into what federal cloud standards might look like, and “[p]robably within a year or two, you’ll see the recommendations coming out of ITAA for what you have to have to be compliant for a commercial cloud for the government.”

In the end, Chaboudy believes cloud computing might even spur an evolution in the federal government’s security culture. “Traditionally, it’s been a system-based security thing, where now you’ve almost really got to break it apart and look at each individual component, and how you put security around each individual component versus an entire system,” he says.

As a concrete step to get the security piece right from the beginning, Apptis has consulted with ServerVault, a managed hosting provider with a targeted federal offering and much experience meeting security requirements like the aforementioned FISMA.

And even though Apptis is willing to put in a lot of work to make a cloud secure should an agency decide to use it, Horvitz says there are certain requirements cloud providers will have to meet.  One of those requirements (which likely will not fly in the DoD, at least), probably will be cordoning off a piece of the cloud specifically for government use. This allows certain security features to be wrapped around that section of infrastructure, Horvitz says, and will allow the appropriate parties to monitor where, exactly, the government’s data is being stored and processed. Additional requirements obviously will be put in place, which might include controlling access to the computer rooms housing the government machines.

Cloud Providers Are On Board

Apptis has talked to Amazon and Google, among other cloud providers, and Horvitz says they, too, have responded well. One reason is that providers already have federal sales divisions and are seeking ways to expand these divisions’ sales into cloud offerings, says Chaboudy. In the case of Google, government users already are using Google Docs and Google Apps, but in unrelated and, oftentimes, unapproved manners. Horvitz says Google wants to capture this revenue legitimately, but the previously discussed security and privacy concerns act as hindrances.

Apptis playing the part of middleman helps ease providers’ minds, too. “When I explain to them that they don’t have to do much to capture a lot of incremental revenue from the federal government,” says Horvitz, “they’re very interested.”

Internally: Federated Infrastructure = Federated IT

Although Horvitz acknowledges the federal government has been slow in cloud uptake, it appears likely that cloud computing eventually will penetrate the federal government. The remaining question is what that strategy will look like: will each agency use public clouds however and whenever it sees fit, or will each build its own cloud à la DISA (Defense Information Systems Agency)?

Horvitz says DISA’s RACE (Rapid Access Computing Environment) initiative is a big step in the right direction, and other agencies are taking notice and are starting to think about clouds when they buy new stuff. They will really reap the rewards when they can leverage public clouds, too, he added. (Apptis holds one of four processing contracts for RACE.)

DISA CIO John Garing has an idea of how the government might best take advantage of the cloud revolution. He suggests eliminating intra-agency IT departments and forming a single entity that provides IT services to all federal agencies. While DISA can serve its three military departments and four services easily enough with its cloud infrastructure, Garing suggests it might be a more challenging prospect within the Department of Homeland Security, for example, which has more than 20 divisions to manage. This solution would help to eliminate the heterogeneity, complexity and unnecessary costs that permeate datacenters, he says — concerns that DISA hopes its RACE cloud will curb within the DoD. “It seems to me that the successful CIOs … own IT — money and people,” he added, “and the business units don’t.”

Garing says the White House and Congress would have to initiate such a sweeping overhaul, and while they are interested, there is no guarantee it will happen. Even so, assuming interagency privacy concerns were met, Garing said, “If I were king for a day, I would definitely do something like that.”
