Almost by definition, federal data centers are environmentally unfriendly. They tend to be overrefrigerated, underutilized energy hogs. Ironically, that includes one run by the Energy Department’s National Renewable Energy Lab (NREL) in Golden, Colo., whose campus includes one of the greenest buildings — literally — in the world. To make its IT operation align with its eco-forward mission, the lab will next tackle its energy-guzzling data center.
NREL has kicked off a green plan for the 30-year-old center. And it promises that two other NREL buildings on the drawing board — a 2,500-square-foot facility that will open in 2010 and a 15,000-square-foot center slated for a 2011 ribbon cutting — will be energy misers, too.
“We will implement every best practice” in the new data center, says Chuck Powers, manager of NREL’s IT Infrastructure and Operations Group. To that end, the lab has begun virtualizing servers and deploying thin-provisioning storage, a method of optimizing space on storage-area networks. The lab also will appropriately size its uninterruptible power supplies, which become notorious sinkholes of energy when not properly matched with the equipment they’re meant to protect, Powers says.
In short, NREL is doing what every other agency is doing, in part by political mandate and in part by sheer necessity: It’s becoming a lean, green data-processing machine. The lab perhaps has a leg up on its peers, however, given that its scientists spend their days investigating ways of squeezing power out of everything from algae to hydrogen fuel cells.
The new NREL buildings that house the data centers will also be working showcases of energy efficiency wrapped in modern aesthetics, says Powers. The first of the two buildings, a 210,000-square-foot administration building that will be known as the Research Support Facility, will tap 750 kilowatts of solar power, a backup generator that runs on biodiesel fuel, and a wood-burning heating plant that uses forest “thinnings” from Colorado’s Front Range to offset nearly 75 percent of NREL’s natural-gas use. “We also want to reuse the heat coming from the data center to warm the building,” adds Powers. The second building, a 130,000-square-foot computational-science center called the Energy Systems Integration Facility, is still in the early planning stages.
“The computer center has been in rented quarters for 30 years, so for us this is a shining moment,” NREL Client Services Group Manager Henri Hubenka says of the lab’s plans to build the green data center and other green buildings.
The solutions are cutting edge; the problems are not. As in most large data centers, NREL added servers ad hoc over the years, often using single servers for each application. “We’ll have to change how we manage IT,” says Powers. “We haven’t done a good job with matching resources with requirements in the data center.” There are about 250 servers in the data center now. After IT completes the consolidation and virtualization project now under way, only 30 to 40 servers will remain, yet they will supply the same computing power, he says. The plan is to build hot and cold aisles with targeted cooling equipment in the data center. That phase will begin after the lab completes an energy audit this spring, says Powers.
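The arithmetic behind a consolidation like NREL’s can be sketched in a few lines. The server counts come from the article; the per-server power draw is an illustrative assumption, not an NREL figure:

```python
# Sketch of the consolidation math: 250 servers shrinking to 30-40.
# The 400 W per-server draw is an illustrative assumption, not an NREL figure.
def consolidation_savings(before, after, watts_per_server=400):
    """Return (consolidation ratio, annual kWh avoided)."""
    ratio = before / after
    hours_per_year = 24 * 365
    kwh_saved = (before - after) * watts_per_server * hours_per_year / 1000
    return ratio, kwh_saved

ratio, kwh = consolidation_savings(250, 35)  # 35 = midpoint of "30 to 40"
print(f"~{ratio:.0f}:1 consolidation, ~{kwh:,.0f} kWh/year avoided")
```

Even under these rough assumptions, retiring roughly 215 physical servers avoids on the order of three-quarters of a million kilowatt hours a year before any cooling savings are counted.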
The shortcomings that prompted the lab to begin its data center greening effort put it on par with other data centers, both in and outside government, according to market researcher Gartner of Stamford, Conn. Windows and Linux servers often use only 15 percent of their CPU resources, and Unix servers often use no more than 25 percent, according to Gartner. That means 75 percent to 85 percent of a server’s power and cooling resources go to waste.
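Gartner’s utilization figures translate into wasted power roughly as follows. This is a simplification that treats power draw as proportional to utilization; real servers draw a large fraction of peak power even when idle, so the true picture is somewhat worse:

```python
# Rough waste estimate from CPU utilization, per the Gartner figures above.
# Assumes power scales with utilization -- a simplification, since an idle
# server still draws much of its peak power.
def wasted_fraction(utilization):
    """Fraction of power and cooling serving no computational work."""
    return 1.0 - utilization

for platform, util in [("Windows/Linux", 0.15), ("Unix", 0.25)]:
    print(f"{platform}: {util:.0%} utilized, "
          f"~{wasted_fraction(util):.0%} of power and cooling wasted")
```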
Those unused resources add up quickly. As of 2006, federal servers and data centers alone accounted for approximately 6 billion kilowatt hours, or 10 percent, of all data center electricity use annually, for a total electricity cost of about $450 million, according to Jonathan Koomey, staff scientist at Lawrence Berkeley National Laboratory and a consulting professor at Stanford University. That meant the federal government ranked sixth among the top 10 data center energy consumers in the United States, according to research by the Environmental Protection Agency; the other nine are all manufacturing sectors.
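Koomey’s figures can be cross-checked with simple division, which yields a plausible average electricity rate for the period:

```python
# Cross-check of the cited federal figures:
# ~6 billion kWh per year at a total cost of ~$450 million.
federal_kwh = 6e9
federal_cost_dollars = 450e6
price_per_kwh = federal_cost_dollars / federal_kwh
print(f"Implied average rate: ${price_per_kwh:.3f}/kWh")
```

The implied rate of about 7.5 cents per kilowatt hour is consistent with commercial electricity prices of the era, which suggests the two cited figures hang together.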
A combination of planned obsolescence and Moore’s Law makes IT equipment notoriously disposable. A flood of inexpensive servers, priced under $25,000 apiece, doubled the electricity use at private and public data centers between 2000 and 2005, Koomey says.
And, in the government, energy savings hasn’t been a mission factor. Department performance, rather than energy efficiency, has been the priority. “The cost of driving cold air into the data center has been low enough,” says Ken Baker, data center infrastructure technologist for Hewlett-Packard, noting that air conditioning often accounts for more than half of a data center’s electricity costs. “In the federal government’s eyes, data center availability has been more important than energy costs.”
In its first report to Congress last year on data center energy use, EPA came to the same conclusion. “Currently, energy use in data centers is mostly viewed as a problem of insufficient power or cooling capacity, which can only be solved by increasing the power and cooling infrastructure,” the EPA stated in its report. “This view ignores the fact that greater energy efficiency is usually the most cost-effective solution to solving these power and cooling constraints.”
Yet, there are great benefits in making the entire data center — servers, air conditioners, power supplies, lighting and all — green. The most direct and dramatic savings for any federal data center will come from changes to the heating, ventilation and air-conditioning (HVAC) systems. More than 50 percent of a center’s energy load comes from these, not from computational work.
Attitudes are definitely getting greener in government, and most certainly between Energy and EPA. In March, the oft-at-odds agencies announced the voluntary National Data Center Energy Efficiency Information Program, which integrates and coordinates several existing activities from the two, including the DOE Save Energy Now initiative and the EPA Energy Star program. The chief task? Create data center energy-efficiency benchmarks and certifications.
EPA this spring launched a 12-month effort to collect data from agencies about data center use. Ultimately, in 2010, the agency plans to use that data to develop metrics that will underpin an energy-efficiency designation for data centers. The move follows work begun by the Green Electronics Council, which manages the government’s Electronic Product Environmental Assessment Tool (EPEAT) product certification.
But EPA also is working to help agencies by crafting Energy Star specifications for servers, which would be a boon for agencies trying to make their data centers energy efficient. Certification is currently available for desktop and notebook systems and monitors. In mid-February, EPA released draft specifications for Energy Star certification for servers, and it expects to finalize them later this year. The goal is simple: Improve the energy efficiency of servers, monitors and related equipment, because the less energy the devices use, the less heat they produce and the less they need energy-sapping cooling equipment.
For existing IT equipment, implementing server consolidation and other energy-management practices can reduce energy use by around 20 percent. Based on the assumption that the federal sector accounts for about 10 percent of electricity use and costs attributable to servers and data centers nationwide, the annual savings in electricity costs in 2011 to the federal government would range from $160 million to $510 million, according to the EPA. That alone could reduce the government’s annual carbon dioxide emissions by 1.5 million metric tons, to 4.7 million, within four years.
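The EPA estimate can be reproduced in outline. The federal range scales from a national savings range by the 10 percent federal share cited above; the national figures here are back-calculated from the article’s federal numbers, not taken directly from the EPA report:

```python
# EPA's federal savings estimate, reconstructed in outline.
# The federal sector is assumed to account for 10 percent of national
# server/data-center electricity costs, so federal savings scale from the
# national range. The $1.6B-$5.1B national range is back-calculated from
# the article's federal figures, not quoted from the EPA report itself.
federal_share = 0.10
national_savings_range = (1.6e9, 5.1e9)  # dollars/year, 2011 projection
low, high = (s * federal_share for s in national_savings_range)
print(f"Projected federal savings: ${low/1e6:.0f}M to ${high/1e6:.0f}M per year")
```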
There is one slight snag, however. With the rising popularity of 1U rack servers, blade servers and multicore processors, computing density per square foot climbs, producing more heat and requiring more cooling. In fact, the capital cost of the power and cooling infrastructure needed to support a new 1U server already exceeds the server’s purchase price, and the server’s lifetime energy costs alone will soon exceed it as well, says Christian Belady, principal power and cooling architect at Microsoft and former distinguished technologist at Hewlett-Packard.
“Measure, measure, measure,” Belady says. “If you don’t measure, you won’t improve. I have always suggested that folks measure power-usage effectiveness — no matter how crude. That is how Microsoft started, and now we have tools with multiple metrics and a culture around efficiency.”
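The metric Belady cites, power-usage effectiveness (PUE), is just the ratio of total facility power to the power that reaches IT equipment; the sample readings below are illustrative, not Microsoft’s or NREL’s:

```python
# Power-usage effectiveness: total facility power / IT equipment power.
# A PUE of 2.0 means every watt of computing requires another watt of
# overhead (cooling, power distribution, lighting).
# Sample readings below are illustrative, not from any real facility.
def pue(total_facility_kw, it_equipment_kw):
    return total_facility_kw / it_equipment_kw

print(pue(1000, 500))  # 2.0 -- typical of an untuned data center
print(pue(1000, 770))  # ~1.3 -- after serious efficiency work
```

Even a crude running log of this one ratio, as Belady suggests, shows whether efficiency changes are paying off.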
The lab may have other tricks up its sleeve to cool its data centers: It’s looking at free cooling, using a water-side economizer as a low-cost technique. It’s also studying ways to bring in outside air — weather permitting — to help cool the data center, says NREL’s Powers. “The good news is there’s less space to cool.”
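The “free cooling” Powers describes boils down to a control decision: use outside air, or economizer-cooled water, whenever the weather can absorb the data center’s heat. The temperature threshold and sample readings below are illustrative assumptions, not NREL setpoints:

```python
# Sketch of a free-cooling (economizer) decision, as NREL describes:
# cool with outside air when weather permits. The 18 C supply threshold
# and the hourly readings are illustrative assumptions, not NREL figures.
def use_free_cooling(outside_temp_c, supply_setpoint_c=18.0):
    """True when outside air is cool enough to serve the data center."""
    return outside_temp_c <= supply_setpoint_c

sample_temps_c = [4, 10, 16, 22, 27, 15, 8]  # illustrative hourly readings
free_hours = sum(use_free_cooling(t) for t in sample_temps_c)
print(f"{free_hours} of {len(sample_temps_c)} sample hours "
      f"qualify for free cooling")
```

In a high, dry climate like Golden’s, a large share of the year’s hours can qualify, which is what makes the technique low cost.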