Data Center Priorities for 2010

Priority Report: Data Center Solutions

Market researcher INPUT identifies five key technologies that will help federal agencies streamline their operations and meet Administration goals.

By Cara Garretson

As federal IT managers plan ahead, they’re looking for technologies that drive down data center costs while boosting productivity, improving decision making, and raising customer service levels throughout their organizations. According to a recent study by market researcher INPUT, five key technologies have yielded these benefits for private-sector companies and are making their way into federal IT plans for 2010 and beyond. These technologies are:

    • Cloud computing

    • Virtualization

    • Service-oriented architectures

    • Open-source software

    • Geospatial technologies

Outlined in INPUT’s Federal Industry Report: Emerging Technology Markets in the U.S. Federal Government, 2009-2014, published in February, these five technologies address IT-related goals set out by the Obama Administration: cost savings, operational and energy efficiency, information sharing and interoperability, transparency, and agility/flexibility. The study, conducted in October and November 2009, surveyed ninety federal and industry IT professionals, who said they expect these five technologies to have a major impact on their environments over the next five years as they move toward data-center consolidation, plug-and-play environments, geographically distributed computing power, and information sharing.

Acknowledging that federal agencies have been deploying these five technologies on a project basis, the INPUT report says that cost pressures and administration mandates will lead to more widespread adoption.

“Over the next five years we’ll see some fairly rapid adoption of these technologies,” says Deniece Peterson, manager of industry analysis at INPUT, who was an author of the study. “While still a fraction of total federal IT spending, we’ll see high growth in these areas.”

Cloud Computing
One of the key benefits promised by cloud computing, defined as transferring some or all applications, compute functions, or storage from an organization’s data center to a third-party provider, is flexibility, both from the computing model’s usage-based payment structure and from its self-service provisioning capabilities. Because cloud providers offer their services on a pay-as-you-go basis, federal agencies can contain costs by paying only for what they use, and self-provisioning lets them expand and contract computing resources as needed.


One factor that may ease federal agencies’ move to cloud computing services is the fact that they can be paid for out of operations and maintenance budgets, which tend to be more accessible to federal IT managers than capital investment funds, says Peterson.

However, there are still challenges regarding cloud computing that federal IT departments must figure out, such as ensuring sufficient security and privacy of data stored in the cloud, how to budget for flexible cloud-computing fees, and what the procurement methods should be.

Virtualization
Virtualization, technology that abstracts computing resources from the underlying hardware so that a single physical machine can run multiple operating systems and applications, reduces hardware, energy, and support costs. Consolidating the servers needed to run a data center also decreases the amount of management required. While virtualization is becoming easier to implement as more management tools reach the market, federal agencies will still need to grapple with the complexity of virtualizing legacy systems, find funding to reprogram applications for virtualization, and seek out IT professionals with the skills to run these systems, according to INPUT.

Service-Oriented Architectures (SOAs)
These collections of computing services, which communicate and work together to deliver business functions, help agencies reduce integration costs and speed the development of new applications while encouraging reusability. The challenges organizations face when implementing SOAs include back-end complexity, a lack of unifying standards, and uncertainty regarding security, storage, bandwidth, and capacity planning, according to the report.

Open-Source Software
By shifting more projects to an open-source software base, agencies can reap the benefits of lower licensing fees, more agile application development, and greater code reuse. And, as is the case with cloud computing, open-source software can be funded out of operations and maintenance, rather than capital, budgets, says Peterson. Challenges with this technology include portability issues, ensuring code is free from malware, and support costs.

What could help spur open-source adoption is that leadership in the Department of Defense (DoD) has released guidance encouraging the use of open-source software, and some of its components, such as the Navy, are already implementing it. “This stamp of approval will formally open the doors to the rest of DoD,” reads the INPUT report. “Civilian agencies, which often follow in the footsteps of DoD, will look to DoD as a ... model for determining what to do and what not to do.”

Geospatial Technology
Traditionally, geospatial technology has been used by agencies such as Defense, Homeland Security, and the Environmental Protection Agency to acquire and manage data in geographic, temporal, or spatial contexts. But a newer use of the technology lets agencies collect, visualize, and map complex ideas and concepts so they can be easily shared and understood by others, leading to better business intelligence and improved service delivery, says Peterson. However, agencies will also have to contend with implementation costs, uncertain return on investment, and the challenge of managing the expanding volume of data that comes with this technology.