Government Takes Lead in Cloud Adoption


By Barbara DePompa

In an unlikely role reversal, speakers at the recent Cloud Computing Summit in Washington, D.C., agreed that the public sector has taken a clear lead in the adoption of cloud computing, stepping ahead of private-sector organizations by embracing pay-per-use solutions for a variety of new services and applications.

Even though the majority of publicly acknowledged government cloud computing investments are still in early phases of implementation, industry suppliers, observers and government officials all said the promise of lowered costs and greater efficiencies has accelerated the pace of adoption despite obvious security, legacy systems integration and governance challenges.

The pay-as-you-go benefits are so compelling that the 2010 federal budget submitted to Congress expanded the use of cloud computing and called for a reduction in the number and cost of federal data centers. The White House has advised agencies to launch cloud computing pilot tests for applications ranging from communications and remote access to virtual data centers, analytics and reporting, web portals, collaboration, and records and case management.

Industry observers cite the Open Government Directive and the apps.gov website as prime cloud-based examples. NASA, meanwhile, has launched Nebula, a home-grown cloud computing environment that allows outside scientists to contribute. DISA has the RACE program, which is being used to test cloud services. The National Highway Traffic Safety Administration (NHTSA) ran its highly publicized ‘Cash for Clunkers’ program on a cloud computing service. The City of Los Angeles outsourced e-mail to a cloud-based solution. Michigan opted for a storage-as-a-service solution.

And as a shared services provider, the Department of Interior’s National Business Center (NBC) is busy rolling out cloud-based offerings for federal agencies. “The feds should be proud of the work they’ve done so far,” said Jon Oltsik, a senior principal analyst for Milford, Mass.-based ESG. “Great leading-edge cloud implementations exist in government today, while many in the private sector still [seemingly] aren’t getting the message.”

Cloud computing “presents a new business opportunity, an understanding that we can take the knowledge housed inside government organizations and present that in a cloud-run service,” said Susan Camarena, chief knowledge officer for the Transportation Department’s Federal Transit Administration (FTA).

Driving an unusually speedy migration are the government’s underlying budgetary constraints, as well as a desire to achieve greater flexibility, efficiency and support for new applications and users, who increasingly expect high-speed availability and greater functionality. In research conducted by Enterprise Strategy Group, Inc., only 13 percent of 700 users surveyed believe their current IT personnel aren’t capable of servicing the organization’s needs. “Behind this statistic is the realization that cloud computing investments aren’t technology-based, but absolutely business-driven decisions,” said ESG’s Oltsik.

Cloud computing also is attractive because it supports all users, no matter where they are located. Such services minimize inefficient infrastructure, while boosting initiatives such as Green IT, disaster recovery/COOP (continuity of operations) and Telework. The U.S. Marine Corps’ virtualization initiative, started in 2007, was driven by requirements for greater security, availability and recoverability, said Chip Brodhun, senior technologist/project director of emerging technologies for the USMC.

For instance, Brodhun said, “the COOP plan for Hurricane Katrina consisted of removable media, packed into rucksacks for a 17-hour truck ride to Kansas City.” Now, after moving to a cloud-based disaster recovery service, “the Marines can be back up and running from almost any location, in just one to three hours,” he explained.

Meanwhile, the Los Alamos National Laboratory campus simply ran out of space and available power capacity for additional servers, which drove a migration to virtualization, and then to a cloud-based service now used to gather and report on energy consumption and savings, said Anil Karmel, a solutions architect at the laboratory. Excluding the power consumed by supercomputing operations, Los Alamos has realized a 25 percent reduction in power consumption, he explained.

Peter Mell, a senior computer scientist for the National Institute of Standards and Technology (NIST) and co-chair of the Cloud Computing Advisory Council, explained how high costs and hefty power consumption in traditional computing environments underscore the need to investigate cloud-based services. “Currently, $800 billion is spent annually on the purchase and maintenance of enterprise software, and 11.8 million servers run at only 15 to 20 percent capacity in data centers,” he said. “Meanwhile, the number of servers doubled between 2001 and 2006, while power consumption per server actually quadrupled during the same period.”

As a result, market researchers predict public sector investment in cloud computing is likely to more than double in the next five years. According to research from INPUT in Reston, Va., as the federal government modernizes IT infrastructures, agencies are investing in cloud computing as a viable alternative to buying and maintaining additional servers and software. Also, agencies don’t want to pay more than other organizations for the same commodity products and services. The groundswell from early adopters, combined with momentum created by senior officials promoting the cloud, is helping drive the cloud’s 27 percent compound annual growth rate, according to INPUT’s research.

Obstacles Ahead
While the migration to cloud computing seems inevitable, security and privacy concerns are still seen as prominent obstacles. Doubts remain that externally controlled cloud services can be adequately protected, and federal agencies must carefully scrutinize industry offerings to ensure adequate security. “Each investment in a cloud-based solution requires proper due diligence,” Oltsik said.

The security required is daunting. The Treasury Department was forced earlier this month to take down four public web sites for the Bureau of Engraving and Printing (BEP) after the discovery of malicious code on a cloud host site. The bureau began using a third-party cloud service provider to host the sites last year, according to a prepared statement. “The hosting company used by BEP had an intrusion and as a result of that intrusion, numerous websites (BEP and non-BEP) were affected.”

Meanwhile, the non-profit Cloud Security Alliance published a report on the biggest cloud computing security threats, based on information from security experts at 30 organizations involved in complex cloud environments.

Top threats include:

* Malicious employees of cloud computing providers;

* Nefarious use (attackers who target cloud providers);

* Insecure interfaces and APIs;

* Shared technology vulnerabilities;

* Data loss or leakage;

* Account, service or traffic hijacking; and

* Unknown risks.

According to Oltsik, security, legacy systems integration and governance, especially related to contracting and service level agreements, remain nagging concerns. “Standards would help,” Oltsik said, as industry suppliers “would be able to better respond if federal agencies worked together to define clear standards to delineate what and how services must be delivered.”

Early adopters said potential cloud customers must keep security, lifecycle management, chargeback, SLA and training issues uppermost in their minds as they negotiate cloud implementations. “It’s especially important to remember that there’s a much higher price tag associated with tier one systems that require 99.999 percent availability,” said Karmel of Los Alamos. He recommended closely inspecting the systems and applications destined for cloud deployment, so that applications which function acceptably at tier two (90 percent uptime) are contracted at that level, avoiding unnecessary expense in service level agreements.
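As a back-of-the-envelope sketch (not from the article), the gap between these availability tiers can be made concrete by converting each percentage into an annual downtime budget, which is what ultimately drives SLA pricing:

```python
# Hypothetical illustration: translate an availability target into the
# annual downtime it permits. Tier names and thresholds follow the
# figures quoted in the article (99.999 percent vs. 90 percent uptime).
HOURS_PER_YEAR = 365 * 24  # 8,760 hours

def downtime_hours(availability_pct: float) -> float:
    """Hours of downtime per year allowed at a given availability level."""
    return HOURS_PER_YEAR * (1 - availability_pct / 100)

for tier, pct in [("tier one (99.999%)", 99.999), ("tier two (90%)", 90.0)]:
    print(f"{tier}: {downtime_hours(pct):.2f} hours of downtime per year")
# tier one allows roughly 0.09 hours (about five minutes) per year,
# tier two roughly 876 hours (about 36.5 days) per year.
```

The four-orders-of-magnitude difference in permitted downtime is why paying tier one rates for an application that tolerates tier two availability wastes money.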

USMC’s Brodhun said that until data and users can be moved easily into and out of cloud environments, the Marines are unlikely to invest in public cloud solutions, which are still widely perceived as too risky. Training is another challenge, he said, one that requires additional investment in personnel and engineering resources.

For now, it seems obstacles from security to privacy, reliability, standards, and regulatory or legislative hurdles have all been “outweighed by the government’s overwhelming desire to reduce complexity and isolation and improve the sharing of information, applications, data and users,” said Javier Vasquez, director of Collaboration & Cloud at Microsoft Federal. In the coming months, Vasquez said, federal customers can look forward to a FISMA-certified solution from Microsoft designed to address at least some federal IT security concerns and further smooth the transition to cloud-based environments.

As complexity and risks have grown, federal IT security mandates – especially the Federal Information Security Management Act (FISMA) – have remained largely focused on testing, evaluation and the accreditation of security solutions. This has created a situation in which federal organizations spend time and effort filing paperwork and providing documentation for compliance, without gaining much in terms of security benefits. FISMA currently is being modified to focus more closely on continuous monitoring and measurement. “Once we achieve certification, we expect to see an uptick in interest among customers of cloud offerings,” said Vasquez, “as so much of the government’s business is predicated on FISMA accreditation.”