Feb 14 2012

Beating the Data Center Heat

Containment strategies maximize cooling efficiency and reduce costs.

When it comes to keeping high-density equipment cool in the data center, Dave Martinez is quite an innovator. Martinez, distinguished technologist for Sandia National Laboratories’ Infrastructure Computing Services, has been tending to the labs’ three data centers for more than 25 years.

As early as the 1980s, Martinez created a homemade hot-aisle return ducting system in one of the labs’ data centers, building containment curtains out of welding curtain material and hanging them from the ceiling down to the top of an air conditioning unit. The curtains eliminated a condition known as short-cycling, in which air returns to the cooling unit before passing through the equipment, so air was delivered more efficiently and the fans operated more effectively.

As computers became denser and temperatures rose in the data center in the early 2000s, Martinez noticed that newly installed 14-kilowatt racks weren’t able to pull in cool air effectively, causing the bottom nodes to overheat. He braced a piece of plastic over the top of the row, which slowed the air coming from the cooling system and distributed it evenly across the racks. This brought the row temperature down from 72 degrees Fahrenheit to the suggested supply temperature of 60 degrees.

Martinez’s next step came even closer to today’s hot aisle/cold aisle containment systems, which keep the hot air exhausted from servers separate from the cool air fed into server intakes, maximizing cooling efficiency.

The IT staff used hot air balloon material to create a cold-aisle containment system. “With this containment system, we were able to lower the fan speeds on our 18 air conditioning units down to about 30 to 40 hertz, which saved an enormous amount of operational costs and reduced data center energy consumption,” explains Martinez. 
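To see why slowing the fans down saves so much, it helps to remember the fan affinity laws: fan power falls off roughly with the cube of fan speed. The short sketch below is purely illustrative and assumes the air conditioning units’ variable-frequency drives have a 60 Hz nominal speed; the figures it prints are ballpark fractions, not measurements from Sandia’s facility.

```python
# Rough estimate of fan energy savings from running VFD-driven fans slower.
# Assumes fan power scales with the cube of fan speed (fan affinity laws)
# and a 60 Hz nominal drive frequency (an assumption, not a Sandia spec).

NOMINAL_HZ = 60.0

def relative_fan_power(drive_hz: float, nominal_hz: float = NOMINAL_HZ) -> float:
    """Fan power as a fraction of full-speed power, per the cube law."""
    return (drive_hz / nominal_hz) ** 3

for hz in (30, 35, 40):
    fraction = relative_fan_power(hz)
    print(f"{hz} Hz: ~{fraction:.0%} of full-speed fan power "
          f"(~{1 - fraction:.0%} savings)")
```

At 30 to 40 Hz, the cube law puts each fan at roughly 13 to 30 percent of its full-speed power draw, which is consistent with the large operational savings Martinez describes.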

Over the past two years, Martinez’s team has begun taking advantage of commercially available hot aisle/cold aisle containment systems in all three of the labs’ data centers. These include a 30,000-square-foot data center, a 3,000-square-foot facility and the main data center, which spans 28,000 square feet over seven rooms. Martinez is rolling out hot-aisle containment in all rooms with high-density equipment.

Many organizations are moving to hot aisle/cold aisle containment — available from manufacturers such as APC, HP, Tripp Lite and Black Box — to improve cooling efficiency and reduce costs.

“It’s pretty much a given for new build-outs because higher-density equipment is so common today,” says Jason Schafer, a research manager at Tier1 Research of Bethesda, Md. “Even five years ago, 1 to 2kW per rack was average, but now it can be 10kW per rack or more. That makes hot- and cold-aisle containment pretty important.”

It can be difficult to retrofit existing data centers with hot- or cold-aisle containment, although it’s not impossible. In many cases, it’s worth the effort, Schafer says.

Maintaining proper temperature is critical, especially when you’re dealing with large storage systems that generate excessive heat.

That’s the case at the U.S. Holocaust Memorial Museum in Washington, D.C., and it’s something that Chief Technology Officer Chandra Chandrasekaran and the museum’s Operations Department have worked hard to achieve.

“We’ve been operating a data center for nearly 15 years, and as our equipment became more efficient and had higher density, we had to find a way to separate the cold aisles from the hot aisles,” Chandrasekaran says.

Because the data center is in a small, confined space, traditional hot aisle/cold aisle technology is not yet part of the plan. In the meantime, staff members in the museum’s IT and Operations departments have done their best to separate hot and cold aisles.

“We may get to hot- and cold-aisle containment at some point in the future,” Chandrasekaran says. “We certainly see the value in it.”

As for Sandia National Labs, innovation in data center cooling continues. Although Martinez intends to keep using hot-aisle containment systems, he thinks the future may be liquid cooling.
