Sep 12 2008

The Incredible Shrinking Data Center

With virtualization technology, agencies reduce the size of the government's IT footprint.

In the nation’s capital, battles over power are a way of life. But two years ago, the House of Representatives faced a different kind of power struggle: It simply could not pump enough electricity into its aging primary data center to keep the center going and growing.

The center provides IT services for all 435 House members and their staff and supports public House websites. Reducing the services offered was simply not an option. There was only one solution: Use virtualization to consolidate servers and reduce the energy they consumed.

Fortunately, the timing was excellent. On the Hill, the “Green the Capitol” initiative had just launched. Around the same time, Congress passed HR 5646, empowering the Environmental Protection Agency to explore and promote the benefits of energy-efficient computing.

“We were initially working the problem purely from a power standpoint,” says Jack Nichols, manager of the Enterprise Technology Branch for the House’s Office of the Chief Administrative Officer (CAO). “When the Green Capitol initiative hit, it gave us the wind at our backs to carry the initiative through.”

Over the next six months, CAO will roll out machines into its newly renovated data center: primarily Sun Fire X4600s powered by AMD processors and running VMware ESX virtualization software, which allows a single machine to run a dozen or more operating systems and applications simultaneously.

Nichols says using VMware will let the House data center boost server utilization — the amount of time machines actually work — from less than 7 percent to more than 60 percent. CAO will also be able to reduce the servers in its test environment and in production to a number “well under three figures,” says Nichols, slashing energy consumption by as much as 75 percent. Fewer machines also produce less heat, so for every watt saved in computing power, Nichols estimates the House will save at least half a watt in energy needed to keep the center cool.
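The arithmetic behind those estimates can be sketched in a few lines. The server counts and the half-watt cooling rule come from the article; the per-server wattage is a hypothetical placeholder, not a reported figure:

```python
# Back-of-envelope model of the House consolidation figures.
# WATTS_PER_SERVER is an assumed value for illustration only.
WATTS_PER_SERVER = 500          # hypothetical average draw per physical server
servers_before = 450            # from the snapshot table below
servers_after = 100             # "100 or fewer"

compute_before = servers_before * WATTS_PER_SERVER
compute_after = servers_after * WATTS_PER_SERVER
compute_saved = compute_before - compute_after

# Nichols: every watt saved on computing saves at least half a watt of cooling.
cooling_saved = compute_saved * 0.5
total_saved = compute_saved + cooling_saved

pct_compute_cut = 100 * compute_saved / compute_before
print(f"Compute power cut: {pct_compute_cut:.0f}%")
print(f"Total watts saved (compute + cooling): {total_saved:,.0f}")
```

With these assumed counts the compute cut comes out near 78 percent, in the same range as the "as much as 75 percent" figure Nichols cites; the point is that the cooling multiplier raises total savings well above the compute savings alone.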

But the House is hardly alone. Across the board, agencies are discovering the benefits they can derive by consolidating and virtualizing servers, increasing efficiency while reducing cost and energy consumption. Better yet, many roll the costs of consolidation into their normal lifecycle upgrade budgets, boosting the return on investment even more.

Leader of the Pack

For instance, over the next two years, the Navy will consolidate 2,600 physical servers that support the Navy Marine Corps Intranet, which, given its 700,000 users, is one of the government’s largest networks. The move will save the Navy more than $1.6 million annually in energy costs alone, says Brandon Kern, NMCI server infrastructure manager for EDS, primary contractor for the network.

Meanwhile, the Defense Information Systems Agency has begun consolidating 4,000 servers. The move will let the agency scale down from 18 data centers worldwide to 13, many running identical hardware platforms that DISA can manage remotely.

“The largest cost driver for us is probably labor,” says Alfred Rivera, director of computing services for DISA. “The more standardized your hardware and software solutions, the more efficient you can be with labor because fewer people can manage more environments.”

Snapshot: Three Consolidations
Agency | Servers at start | Servers at end | Projected benefits
House Office of the Chief Administrative Officer | 450 | 100 or fewer | Cut power consumption by at least 45%; boost utilization rates
Navy Marine Corps Intranet | 2,600 | 300 to 400 | Cut energy costs by $1.6 million annually; improve uptime and performance
Defense Information Systems Agency | 4,000 | As few as possible; 8 to 15 virtual environments per server | Reduce number of data centers; cut labor costs; boost remote management capability
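The DISA row implies a simple sizing calculation: divide the virtual-machine count by the consolidation ratio to get the physical hosts required. The 4,000-server count and the 8-to-15 ratio are from the table; the calculation itself is just illustrative arithmetic:

```python
import math

# Physical hosts needed for 4,000 consolidated servers at the
# consolidation ratios given in the snapshot table.
vms = 4000
for ratio in (8, 15):
    hosts = math.ceil(vms / ratio)
    print(f"{ratio} VMs per host -> {hosts} physical hosts")
```

At 8 VMs per host that is 500 machines; at 15 VMs per host, 267. Either way, an order-of-magnitude reduction from 4,000 physical boxes.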

Though the savings are attractive, the real IT service benefits come from improved uptime and better performance, says Kern. A physical server must go offline every time IT needs to upgrade or patch it. Using the VMware VMotion live migration tool, NMCI can move running virtual machines to other physical servers on the network and continue delivering IT services without interruption.
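The maintenance pattern Kern describes, emptying a host of its virtual machines before taking it down, can be illustrated with a toy model. The actual migration mechanics belong to tools such as VMotion; this sketch only shows the scheduling idea, and every host and VM name in it is invented:

```python
# Toy model of "draining" a host before patching it.
# Host and VM names are illustrative, not real NMCI systems.
hosts = {
    "esx-01": ["mail-vm", "web-vm"],
    "esx-02": ["print-vm"],
}

def drain(hosts, target):
    """Move every VM off `target` to the least-loaded remaining host."""
    for vm in list(hosts[target]):
        dest = min((h for h in hosts if h != target),
                   key=lambda h: len(hosts[h]))
        hosts[dest].append(vm)
        hosts[target].remove(vm)
    return hosts

drain(hosts, "esx-01")
print(hosts)   # esx-01 is now empty and safe to patch offline
```

Once the target host is empty, IT can patch and reboot it with no service interruption, then migrate workloads back.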

Rivera notes that DISA’s virtual environment will let it provide on-demand computing services for thousands of Defense Department users worldwide. If a DOD organization needs a new service offering, it can provision one on the fly, then shut it down when it’s no longer needed — far more quickly and cheaply than configuring physical servers, he says.

Preparing for Change

Moving to a virtualized environment requires careful and intricate planning, says Steve Fink, senior infrastructure architect for Avanade, which has consulted with DISA and other agencies on consolidation schemes.

“Your organization needs to have achieved a certain level of maturity or be designed from the ground up as a virtual environment, or you’re going to hit a lot of pain points,” Fink says.

Photo: Randall Scott
DISA's Alfred Rivera says his agency's virtual environment will let it provide on-demand computing services for thousands of Defense users worldwide.

CAO’s Nichols recommends establishing a baseline for performance first, to see how machines perform under actual loads and which applications work well in a virtual environment. Resource-hungry apps, such as large databases or Microsoft Exchange servers, generally still require a dedicated machine. Security and regulatory concerns may also prevent some applications from running alongside others on the same host.
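Nichols' baselining step can be approximated with a short script: sample each server's utilization over time, then flag persistently idle machines as consolidation candidates while leaving busy ones, such as large databases, on dedicated hardware. The sample data and the 20 percent threshold below are illustrative assumptions; real baselines would come from monitoring tools:

```python
# Hypothetical CPU-utilization samples (percent, gathered over a week).
samples = {
    "web-01":  [4, 6, 5, 9, 3],
    "db-01":   [62, 71, 58, 66, 75],   # busy database: keep it dedicated
    "file-01": [8, 12, 7, 10, 6],
}

THRESHOLD = 20  # assumed cutoff: below this average, a server is a candidate

def average(xs):
    return sum(xs) / len(xs)

candidates = sorted(
    name for name, xs in samples.items() if average(xs) < THRESHOLD
)
print("Virtualization candidates:", candidates)
```

Here the two lightly loaded servers are flagged while the database stays on its own machine, mirroring the triage Nichols describes.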

Virtualization requires identical — and powerful — hardware platforms across the data center, especially if an organization wants to take advantage of data migration tools. Some of the money saved by buying fewer physical servers ends up being spent on faster processors, more memory and massive storage area networks.

“To get the full benefits of high availability and resource allocation, NMCI needed to become 100 percent storage-dependent,” says Kern. “In our initial deployment, we went from zero SAN storage to 30 terabytes. This is a major cost that can come back to bite you if you’re not prepared for it.”

Finally, there’s the human factor. Working with virtual machines makes managing software licenses, tracking IT assets and backing up data more complex. “Training is paramount,” says Fink. “You’ve got to make sure your people understand the implications of operating in a virtualized environment.”

The biggest barrier may be cultural, notes Kern, as IT pros shift from seeing servers as boxes sitting on a rack to something far less tangible — and perhaps more exciting.

“My thinking had to completely change,” he says. “Servers are no longer something you can just reach out and touch. Now there are 300 to 400 server administrators who also need to change their way of thinking.”

Photo: Joshua Roberts