Why Data Center Consolidation Drives ROI
Less risk: What agency isn't looking for that when it comes to IT infrastructure? And less risk is proving to be one of the early benefits of agencies' efforts to reduce their data center footprint.
"Moving to a shared environment spreads costs and risks throughout USDA, rather than each agency assuming these burdens ourselves," says Rory Schultz, chief technology officer of the Agriculture Department's Food and Nutrition Service. "Cloud computing is still evolving, and there is still much potential to tap."
FNS and other agencies, including the Environmental Protection Agency and the National Institutes of Health, are taking different paths to meet the goals of the Office of Management and Budget's Federal Data Center Consolidation Initiative, which calls for shuttering at least 962 data centers nationwide by 2015. And they're reaping a variety of benefits, such as improved availability and greater attention to the needs of the enterprise.
FNS started its consolidation efforts by moving its Microsoft Exchange environment to USDA's Enterprise Messaging System for e-mail, eliminating two mail servers and two gateways in the process.
The agency's data consolidation efforts also have improved its IT resiliency, notes Schultz, who serves as the director of FNS' Technology Division as well as its CTO. And resiliency is certainly critical to an agency that administers nutrition assistance programs that serve 44 million Americans every month.
As it progresses with its consolidation efforts, FNS is also working on closing a commercial data center by migrating its contents to the USDA's National Information Technology Center in Kansas City, Mo. An enterprise environment, NITC provides platform as a service (PaaS) to various agencies and organizations by leveraging technologies from a host of partners, including Cisco Systems, HP, Juniper Networks, Microsoft, Red Hat and VMware.
The commercial data center contains four customer-facing web applications on 24 physical servers, divided into multiple virtual servers, plus associated networking hardware. Moving production and testing to NITC will reduce FNS' data center footprint by about 20 percent.
"We've been successfully running a public website from the NITC facility since June," Schultz says. "Although we're utilizing NITC's PaaS, we retain administrative access."
A team composed of representatives from FNS' various divisions and NITC is shepherding both migrations. Led by seasoned project management professionals, the team follows project management best practices to carry out its duties.
Expert project management is critical, says Shawn P. McCarthy, director of research for IDC Government Insights. "Consolidation is about the classic project management triangle: scope, cost and timeline," McCarthy says. "You need top-notch project managers who know which of the three sides is most important in order to keep everything in sync."
Schultz agrees. "You need project managers with the fortitude to stick to their guns," he says. "Even when there is a push to go live, you have to insist on getting it right for the overall mission to succeed. And for consolidation efforts, it's important to manage goals rather than dates."
At EPA: A Cloud Is Born
Expert project management also plays a major role at the EPA, where consolidation began in 2008, prior to OMB's consolidation initiative.
To form its collaborative Computer Room, Server and Storage Management (CRSSM) team, EPA drew on IT operations managers from across the agency.
After consulting industry best practices, the CRSSM team established a strategy for local and national consolidation agencywide. The two-pronged strategy seeks to centralize applications that are used throughout the enterprise and optimize any remaining local resources.
Then, the CRSSM members changed hats and became the project management team for the national effort. Each team member also serves as the project manager for his or her local initiatives.
"We began by identifying four existing server rooms to serve as the primary data centers within the agency for enterprise applications," says David Updike, acting director of EPA's National Computer Center in Research Triangle Park, N.C., one of the four data centers.
Then, among the primary data centers, EPA established a private cloud, with the individual facilities providing redundancy for one another to ensure high availability and continuity of operations.
For the local efforts, EPA will aim to optimize technologies that each facility still needs onsite. "We'll consolidate multiple server rooms to one room on each campus," Updike says. "And, we'll drive the remaining servers toward virtualization."
Overall, EPA expects to reduce its 80 existing server rooms (which range from more than 2,000 square feet to less than 100) to about 60. Additionally, the number of physical servers will drop from about 1,950 to 1,000.
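The projected reductions above can be checked with a quick calculation (the room and server counts come from EPA's figures; the derived percentages below are illustrative):

```python
# Derive the percentage reductions implied by EPA's consolidation targets.
def reduction_pct(before: int, after: int) -> float:
    """Percentage decrease when 'before' units shrink to 'after'."""
    return (before - after) / before * 100

rooms_cut = reduction_pct(80, 60)        # server rooms: 25% fewer
servers_cut = reduction_pct(1950, 1000)  # physical servers: roughly 49% fewer

print(f"Server rooms reduced by {rooms_cut:.0f}%")
print(f"Physical servers reduced by {servers_cut:.0f}%")
```

In other words, the server cut runs nearly twice as deep as the room cut, which is consistent with the agency's plan to virtualize the servers that remain.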
64%: Percentage of workload that federal IT professionals predict will be virtualized by 2015.
SOURCE: "Consolidation Conundrum" (August 2011, Juniper Networks)
Like FNS, EPA began consolidating its enterprise applications with e-mail. It reduced more than 180 mail servers scattered across the country to about 20 within the four primary data centers, where solutions from F5 Networks handle e-mail load balancing, and a Cisco WAAS (Wide Area Application Services) solution performs WAN optimization. An HP 3PAR virtualized subsystem stores mail.
"We expect to complete the enterprise e-mail migration in the first quarter of 2012," Updike says. "Then, local campus optimization will occur over the course of the next three years."
At NIH: The Big-Picture Approach
The NIH also began its consolidation journey with widely dispersed server rooms located near its research facilities.
"We identified a total of 108 data centers distributed throughout the Washington, D.C., metro area, as well as in Montana and North Carolina," says Stacy Charland, acting deputy CIO of NIH. "Although the mandate defines a data center as larger than 500 square feet and 20 servers, we've chosen to define any room with a server as a data center."
By using such a broad definition, NIH aims to consolidate based on quality. "Our intent wasn't just to meet OMB's targets," Charland says. "We're looking at how to ensure we have the capacity necessary for working with huge volumes of research data and the large sizes of biomedical imaging files. For example, we don't want to close a very well-built smaller data center that could prove critical for future needs."
Still, NIH has already shut 14 data centers on the way to its target of closing 32 by 2015, a reduction that will total nearly 30 percent. "We used an objective scorecard measurement system to identify data centers for consolidation," Charland says. "And, in some cases, it makes more sense to wait until the next server refresh cycles before consolidating."
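NIH has not published the details of its scorecard, but the general approach — scoring each facility against weighted criteria and ranking the results — can be sketched as follows. The criteria names, weights, and sample rooms below are purely hypothetical assumptions for illustration:

```python
# Hypothetical weighted scorecard for ranking data centers as consolidation
# candidates. NIH's actual criteria and weights are not described in the
# article; everything below is an illustrative assumption.
WEIGHTS = {
    "utilization": 0.30,          # low utilization -> stronger closure candidate
    "facility_age": 0.25,         # older facility -> stronger candidate
    "energy_efficiency": 0.25,    # poor efficiency -> stronger candidate
    "mission_criticality": 0.20,  # less critical -> easier to close
}

def consolidation_score(metrics: dict) -> float:
    """Weighted sum of 0-10 criterion scores; higher = better closure candidate."""
    return sum(WEIGHTS[k] * metrics[k] for k in WEIGHTS)

# Two made-up rooms to show how the ranking would work.
candidates = {
    "Building 12 server room": {"utilization": 8, "facility_age": 7,
                                "energy_efficiency": 6, "mission_criticality": 9},
    "Imaging data center":     {"utilization": 3, "facility_age": 2,
                                "energy_efficiency": 4, "mission_criticality": 2},
}

# Rank rooms from strongest to weakest consolidation candidate.
for name, m in sorted(candidates.items(),
                      key=lambda kv: consolidation_score(kv[1]), reverse=True):
    print(f"{name}: {consolidation_score(m):.2f}")
```

A scheme like this also accommodates Charland's point about timing: a room's score can be weighed against its refresh cycle, deferring closure until the next hardware turnover makes it cheapest.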
Ultimately, the Health and Human Services Department agency may go beyond current plans. To this end, NIH has formed a working group charged with developing agencywide data center standards. In addition to providing a roadmap for internal operations, the standards will help NIH assess vendors such as cloud providers.
Best Practices @
Looking for best practices on data center consolidation? Check out fedtechmagazine.com/1111consolidation.
After completing the standards, NIH expects to conduct further evaluations.
"Although some local data centers may remain to enable high-performance computing, we're keeping a close eye on cloud technologies to see when cloud-based facilities" meet the requirements of the Federal Information Security Management Act and other regulations, Charland says.
Regardless, at an agency where 27 institutes and centers have historically acted independently, the process of meeting the consolidation mandate is reaping rewards and creating welcome cohesion.
"People are starting to shift their thinking from local needs to enterprise considerations," Charland says. "Seeing that shift is exciting indeed."